Search results

11 – 20 of over 138,000
Open Access
Article
Publication date: 5 December 2023

Liqun Hu, Tonghui Wang, David Trafimow, S.T. Boris Choy, Xiangfei Chen, Cong Wang and Tingting Tong

The authors’ conclusions are based on mathematical derivations that are supported by computer simulations and three worked examples in applications of economics and finance…

Abstract

Purpose

The authors’ conclusions are based on mathematical derivations that are supported by computer simulations and three worked examples in applications of economics and finance. Finally, the authors provide a link to a computer program so that researchers can perform the analyses easily.

Design/methodology/approach

Based on a parameter estimation goal, the present work is concerned with determining the minimum sample size researchers should collect so their sample medians can be trusted as good estimates of corresponding population medians. The authors derive two solutions, using a normal approximation and an exact method.
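
For orientation only, the normal-approximation route can be sketched from the standard large-sample result that the sample median has standard error roughly 1/(2 f(m) sqrt(n)), where f(m) is the population density at the median. The code below is a minimal sketch under that result, illustrated with an arbitrary skew-normal shape parameter; it is not the authors' program or their exact method.

import math
from scipy.stats import norm, skewnorm

def min_n_median_normal_approx(density_at_median, w, alpha=0.05):
    # smallest n so the sample median lies within +/- w of the population
    # median with confidence 1 - alpha, under the normal approximation
    z = norm.ppf(1 - alpha / 2)
    return math.ceil((z / (2 * density_at_median * w)) ** 2)

# Illustration with a skew-normal population (shape a = 5 is an arbitrary choice).
a = 5
m = skewnorm.ppf(0.5, a)      # population median
f_m = skewnorm.pdf(m, a)      # density at the median
print(min_n_median_normal_approx(f_m, w=0.1))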

Findings

The exact method provides more accurate answers than the normal approximation method. The authors show that the minimum sample size necessary for estimating the median using the exact method is substantially smaller than that using the normal approximation method. Therefore, researchers can use the exact method to enjoy a sample size savings.

Originality/value

In this paper, the a priori procedure is extended to estimating the population median under skew normal settings. The mathematical derivation, supported by computer simulations, of the exact method for using the sample median to estimate the population median is new, and a link to a free and user-friendly computer program is provided so researchers can make their own calculations.

Details

Asian Journal of Economics and Banking, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2615-9821

Article
Publication date: 16 October 2009

Mark Schreiner

The purpose of this paper is to provide a rigorous, statistically correct, and low-cost way to audit-sample a lender's loan portfolio, be it a microlender or another type of…

Abstract

Purpose

The purpose of this paper is to provide a rigorous, statistically correct, and low-cost way to audit-sample a lender's loan portfolio, be it a microlender or another type of lender. No other paper applies this method to loan portfolios, even though it is a high-demand application.

Design/methodology/approach

Standard techniques of audit sampling and dollar unit sampling with stratification are applied to the particular case of a microlender's portfolio. Unlike the audit sampling that almost all auditors use, no arbitrary rules of thumb are applied.

Findings

The paper finds that statistical audit sampling for a lender's loan portfolio is simple, rigorous, and inexpensive.

Practical implications

In audit sampling, most auditors use arbitrary rules of thumb and have no idea whether they are sampling enough items to be sure, with some desired level of confidence, that a clean sample really indicates an absence of defects. This simple, inexpensive and statistically rigorous technique allows auditors who want to do a good job to quantify the precision of their statements in a very common application.
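
As an illustration of the sample-size arithmetic behind such statements, the sketch below is a generic discovery-sampling calculation, not the paper's full dollar-unit design with stratification: it finds the smallest sample for which seeing zero defects supports, at a chosen confidence level, the claim that the defect rate is below a tolerable level.

import math

def zero_defect_sample_size(p_tol, risk=0.05):
    # smallest n such that, if the true defect rate were p_tol, the chance of
    # finding zero defects in the sample is at most `risk`: (1 - p_tol)^n <= risk
    return math.ceil(math.log(risk) / math.log(1 - p_tol))

print(zero_defect_sample_size(p_tol=0.02, risk=0.05))  # about 149 loans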

Originality/value

This paper combines several disparate threads from the statistical literature on audit sampling in a way that auditors (who are usually not statisticians) can apply to audit the quality of a lender's portfolio – microfinance or otherwise – which is a very common need.

Details

Managerial Finance, vol. 35 no. 12
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 23 January 2019

Barry Cobb and Linda Li

Bayesian networks (BNs) are implemented for monitoring a process via statistical process control (SPC) where attribute data are available on output from the system. The paper aims…

Abstract

Purpose

Bayesian networks (BNs) are implemented for monitoring a process via statistical process control (SPC) where attribute data are available on output from the system. The paper aims to discuss this issue.

Design/methodology/approach

The BN provides a graphical and numerical tool to help a manager understand the effect of sample observations on the probability that the process is out-of-control and requires investigation. The parameters for the BN SPC model are statistically designed to minimize the out-of-control average run length (ARL) of the process at a specified in-control ARL and sample size.
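
For comparison with the charts the model is benchmarked against, the ARL arithmetic for a plain (non-adaptive) np chart is straightforward: the ARL is the reciprocal of the per-sample signal probability. The sketch below is that generic baseline with arbitrary parameters, not the authors' BN model.

from scipy.stats import binom

def np_chart_arl(n, p, ucl):
    # signal when the defect count exceeds the upper control limit
    p_signal = binom.sf(ucl, n, p)      # P(D > ucl), D ~ Binomial(n, p)
    return 1.0 / p_signal

print(np_chart_arl(n=50, p=0.02, ucl=4))   # in-control ARL at p = 0.02
print(np_chart_arl(n=50, p=0.06, ucl=4))   # out-of-control ARL after a shift to p = 0.06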

Findings

The BN model outperforms adaptive np control charts in all experiments, except for some cases where only a large change in the proportion of sample defects is relevant. The BN is particularly useful when small sample sizes are available and when managers need to detect small changes in the proportion of defects produced by the process.

Research limitations/implications

The BN model is statistically designed and parameters are chosen to minimize out-of-control ARL. Future advancements will address the economic design of BNs for SPC with attribute data.

Originality/value

The BNs allow qualitative knowledge to be combined with sample data, and the average percentage of defects can be modeled as a continuous random variable. The framework of the BN easily permits classification of the system operation into two or more states, so diagnostic analysis can be performed simultaneously with statistical inference.

Details

International Journal of Quality & Reliability Management, vol. 36 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 10 August 2015

D. R. Prajapati and Sukhraj Singh

The purpose of this paper is to counter autocorrelation by designing the X̄ chart using warning limits. Various optimal schemes of the modified X̄ chart are proposed for various sample…

Abstract

Purpose

The purpose of this paper is to counter autocorrelation by designing the X̄ chart using warning limits. Various optimal schemes of the modified X̄ chart are proposed for various sample sizes (n) at correlation levels (Φ) of 0.00, 0.475 and 0.95. These optimal schemes are compared with the double sampling (DS) chart suggested by Costa and Claro (2008).

Design/methodology/approach

The performance of the chart is measured in terms of the average run length (ARL), that is, the average number of samples before an out-of-control signal is obtained. Because of autocorrelation among the data, the performance of the chart becomes suspect. The ARLs at various sets of chart parameters are computed by simulation using MATLAB. The suggested optimal schemes are simple, with a limited number of parameters and a small sample size (n = 4), and this simplicity makes them very helpful in quality control.
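
The authors' MATLAB program is not reproduced here; the sketch below only illustrates, under assumed AR(1) observations and naive iid-based limits, how the ARL of an X̄ chart can be estimated by simulation. All parameter values are placeholders.

import numpy as np

def simulate_arl(phi=0.475, n=4, L=3.0, shift=0.0, reps=1000, seed=None):
    rng = np.random.default_rng(seed)
    sigma_x = 1.0 / np.sqrt(1.0 - phi**2)   # stationary sd of the AR(1) observations
    limit = L * sigma_x / np.sqrt(n)        # naive limit that ignores within-sample correlation
    run_lengths = []
    for _ in range(reps):
        x, t = 0.0, 0
        while True:
            t += 1
            sample = []
            for _ in range(n):
                x = phi * x + rng.standard_normal()
                sample.append(x + shift)
            if abs(np.mean(sample)) > limit:
                run_lengths.append(t)
                break
    return np.mean(run_lengths)

print(simulate_arl(shift=0.0))   # in-control ARL
print(simulate_arl(shift=1.0))   # ARL after a one-sigma shift in the process mean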

Findings

The suggested optimal schemes of the modified X̄ chart are compared with the DS chart suggested by Costa and Claro (2008). It is concluded that the modified X̄ chart outperforms the DS chart at various levels of correlation (Φ) and shifts in the process mean. The simplicity of the modified X̄ chart's design makes it versatile for many industries.

Research limitations/implications

Both schemes are optimized under the assumption of a normal distribution, but this assumption may be relaxed when designing these schemes for autocorrelated data. The optimal schemes for the X̄ chart can be developed for variable sample sizes and variable sampling intervals, and can also be explored for cumulative sum and exponentially weighted moving average charts.

Practical implications

The level of correlation among the process outputs of any industry can be determined, and the suggested control chart parameters corresponding to that level of correlation can then be applied. The understandable and robust design of the modified X̄ chart makes it usable for industrial quality control.

Social implications

The rejection level of products in industry can be reduced by designing better control chart schemes, which also reduces the loss to society, as suggested by Taguchi (1985).

Originality/value

Although this work is an extension of previous work, it can be applied to various manufacturing industries as well as service industries where the data are positively correlated and normally distributed.

Details

The TQM Journal, vol. 27 no. 5
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 1 March 1993

V. Soundararajan and S. Devaraj Arumainayagam

Presents a compact table yielding the parameters of a single sampling scheme. The table is compatible with the structure of MIL‐STD‐105D and the switching procedure incorporated…

Abstract

Presents a compact table yielding the parameters of a single sampling scheme. The table is compatible with the structure of MIL‐STD‐105D, and the switching procedure incorporated in this scheme is simpler than that of MIL‐STD‐105D. The basis for the construction of the table is given. Methods are given for selecting a scheme by either acceptable quality level, limiting quality level, indifference quality level or average outgoing quality limit, as a function of lot size.
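
As background for such tables, the operating characteristic of a single sampling plan (n, c) and its approximate average outgoing quality limit can be computed directly. The sketch below uses the binomial OC function for a large lot and arbitrary plan parameters; it does not reproduce the compact table or the switching procedure.

import numpy as np
from scipy.stats import binom

def oc(p, n, c):
    # probability of accepting a lot of incoming quality p under plan (n, c)
    return binom.cdf(c, n, p)

def aoql(n, c):
    # approximate AOQ = p * Pa(p) for rectifying inspection with lot size >> n
    p_grid = np.linspace(0.001, 0.2, 2000)
    aoq = p_grid * oc(p_grid, n, c)
    return p_grid[np.argmax(aoq)], aoq.max()

print(oc(0.01, n=80, c=2))   # acceptance probability at 1% defective
print(aoql(n=80, c=2))       # (quality level where AOQ peaks, AOQL)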

Details

International Journal of Quality & Reliability Management, vol. 10 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 23 May 2008

D.R. Prajapati and P.B. Mahapatra

The purpose of this paper is to introduce a new design of the X̄ chart to catch smaller shifts in the process average while maintaining simplicity like the Shewhart…

Abstract

Purpose

The purpose of this paper is to introduce a new design of the X̄ chart that catches smaller shifts in the process average while maintaining the simplicity of the Shewhart X̄ chart, so that it may be applied at the shopfloor level.

Design/methodology/approach

In this paper, a new X̄ chart with two strategies is proposed which can overcome the limitations of the Shewhart, CUSUM and EWMA charts. The Shewhart chart uses only two control limits to arrive at a decision to accept the null hypothesis (H0) or the alternative hypothesis (H1), but in the new chart, two more limits at K times the sample standard deviation on either side of the center line have been introduced. These limits are termed warning limits. The first strategy is based on the chi-square distribution (CSQ), while the second strategy is based on the average of sample means (ASM).
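
The CSQ and ASM strategies themselves are not reproduced here; the sketch below only illustrates the general idea of supplementing control limits with warning limits, using a standard run rule and arbitrary parameters.

def chart_signal(means, center, s_mean, K=1.5, L=3.0, r=2):
    # Signal if a subgroup mean falls outside the control limits (+/- L*s_mean),
    # or if r consecutive means fall outside the warning limits (+/- K*s_mean).
    consecutive = 0
    for i, m in enumerate(means):
        dev = abs(m - center)
        if dev > L * s_mean:
            return i                      # beyond a control limit
        if dev > K * s_mean:
            consecutive += 1
            if consecutive >= r:
                return i                  # r consecutive points in the warning zone
        else:
            consecutive = 0
    return None                           # no signal

print(chart_signal([0.1, 0.4, 1.8, 1.9, 0.2], center=0.0, s_mean=1.0))  # signals at index 3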

Findings

The proposed chart with the ASM strategy shows lower average run length (ARL) values than the ARLs of the variable parameter (VP) chart in most cases. The VP chart shows slightly better performance than the new chart, but only at large sample sizes (n) of 12 and 16. The VSS chart also shows lower ARLs, but only at very large sample sizes, which should not be used because, as far as possible, samples should be taken from a lot produced under identical conditions. The inherent feature of the new chart is its simplicity: it can be used without difficulty at the shopfloor level because it uses only a fixed sample size and a fixed sampling interval, whereas it is very difficult to set the various chart parameters in the VP and VSS charts.

Research limitations/implications

Considerable effort has been expended to develop the new strategies for monitoring the process mean, and various assumptions and factors affecting the performance of the chart have been identified and taken into account. In the proposed design, the observations are assumed to be independent of one another, but they may also be assumed to be autocorrelated with previous observations, and the performance of the proposed chart may then be studied.

Originality/value

The research findings could be applied to various manufacturing and service industries, as the proposed chart is more effective than the Shewhart chart and simpler than the VP, VSS and CUSUM charts.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 5
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 24 October 2013

Jinyong Kim and Yong-Cheol Kim

U.S. bank holding companies (BHCs) have experienced dynamic changes over the period 2000–2010. We find that the size distribution of sample banks becomes highly positively skewed…

Abstract

U.S. bank holding companies (BHCs) have experienced dynamic changes over the period 2000–2010. We find that the size distribution of sample banks becomes highly positively skewed, with a small number of big banks becoming super-sized, and these big banks tend to take extra risk by holding derivative positions for trading purposes. The ten largest risk-taking banks hold about 70% of the total assets of all the sample banks in 2010. We investigate whether the risk-taking activities of the BHCs translate into higher risk-adjusted return performance. In extensive panel regression analyses, we find no clear evidence that the strategy of large banks of holding derivative positions for trading purposes enhances risk-adjusted performance. We also find that the negative impact of extra risk-taking on risk-adjusted performance grows with bank size.

Details

Global Banking, Financial Markets and Crises
Type: Book
ISBN: 978-1-78350-170-0

Article
Publication date: 4 September 2023

Pankaj Vishwakarma and Malaya Ranjan Mohapatra

Understanding consumer behavior across various contexts within marketing has long been the focus of studies. Although many models are used in explaining consumers' behavior, one…

Abstract

Purpose

Understanding consumer behavior across various contexts within marketing has long been the focus of studies. Among the many models used to explain consumers' behavior, the Model of Goal-Directed Behavior (MGB) is becoming prominent in marketing. Despite its popularity, prior research on MGB has shown inconsistent outcomes regarding the causal associations among MGB variables. To overcome this, the authors adopt a meta-analytic review of marketing studies grounded in MGB to examine consumers' behavior.

Design/methodology/approach

The study reviewed and analyzed 611 correlations from 27 studies with 31 samples (combined sample size of 9588) using a meta-analytic structural equation modeling (MASEM) technique.
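
The authors' full MASEM is not reproduced here; the sketch below illustrates only the first stage of such an analysis, pooling a single correlation across studies with a random-effects model on Fisher's z scale (the correlations and sample sizes shown are hypothetical).

import numpy as np

def pool_correlation(r, n):
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                    # Fisher z transform
    v = 1.0 / (n - 3.0)                  # within-study variance of z
    w = 1.0 / v
    z_fixed = np.sum(w * z) / np.sum(w)
    Q = np.sum(w * (z - z_fixed) ** 2)   # heterogeneity statistic
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(r) - 1)) / C)   # DerSimonian-Laird between-study variance
    w_star = 1.0 / (v + tau2)
    return np.tanh(np.sum(w_star * z) / np.sum(w_star))   # pooled correlation

# Hypothetical attitude -> desire correlations from four studies.
print(pool_correlation(r=[0.55, 0.48, 0.62, 0.40], n=[220, 310, 150, 410]))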

Findings

The outcomes of MASEM confirm the significance of all the proposed relationships in the MGB model. However, among all the proposed MGB relationships, attitude shows the strongest influence on desire formation. Further, past buying experience and positive anticipated emotions affect desire more strongly in developed nations than in developing nations.

Research limitations/implications

The current work has considered various recommended moderators (e.g. culture, crisis situation, sample size, method of data collection, etc.); however, it does not consider the dimension of gender dominance. Future researchers should keep this in mind when conducting similar studies. Future scholars can also perform comparative studies of MGB across domains and subdomains to gain further insights.

Originality/value

The current work offers a better understanding of the application of MGB in marketing. As one of the first meta-analyses of MGB in marketing that also considers the effect of various moderators, it adds to the literature on MGB in marketing. It will also help future researchers understand MGB as a framework and its application in marketing.

Details

Marketing Intelligence & Planning, vol. 41 no. 8
Type: Research Article
ISSN: 0263-4503

Article
Publication date: 1 October 2018

Nataliya Chukhrova and Arne Johannssen

The purpose of this paper is to construct innovative exact and approximative sampling plans for acceptance sampling in statistical quality control. These sampling plans are…

Abstract

Purpose

The purpose of this paper is to construct innovative exact and approximative sampling plans for acceptance sampling in statistical quality control. These sampling plans are determined for crisp and fuzzy formulation of quality limits, various lot sizes and common α- and β-levels.

Design/methodology/approach

The authors use generalized fuzzy hypothesis testing to determine sampling plans with fuzzified quality limits. This test method allows consideration of an indifference zone related to expert opinion or user priorities. In addition to the exact sampling plans calculated with the hypergeometric operating characteristic function, the authors consider approximative sampling plans using a little-known but excellent operating characteristic function. Further, a comprehensive sensitivity analysis of the calculated sampling plans is performed in order to examine how the inspection effort depends on crisp and fuzzy formulation of quality limits, the lot size and specifications of the producer's and consumer's risks.
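
The fuzzy formulation is not reproduced here; the sketch below illustrates only the crisp exact case, searching for the smallest plan (n, c) whose hypergeometric operating characteristic meets given producer's and consumer's risks. The lot size and quality limits are arbitrary.

from scipy.stats import hypergeom

def exact_plan(N, aql, lql, alpha=0.05, beta=0.10):
    D1, D2 = round(N * aql), round(N * lql)      # defectives in an AQL lot / an LQL lot
    for n in range(1, N + 1):
        for c in range(0, n + 1):
            if hypergeom.cdf(c, N, D2, n) > beta:
                break                            # a larger c only helps bad lots pass
            if hypergeom.cdf(c, N, D1, n) >= 1 - alpha:
                return n, c                      # both risks are met
    return None

print(exact_plan(N=1000, aql=0.01, lql=0.05))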

Findings

The results of the parametric sensitivity analysis of the calculated sampling plans, together with the conclusions regarding the approximation quality, provide the user with a comprehensive basis for a direct implementation of the sampling plans in practice.

Originality/value

The constructed sampling plans ensure the simultaneous control of producer’s and consumer’s risks with the smallest possible inspection effort on the one hand and a consideration of expert opinion or user priorities on the other hand.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 9
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 2 April 2019

Nda Muhammad, Mohd Shalahuddin Adnan, Mohd Azlan Mohd Yosuff and Kabiru Abdullahi Ahmad

Sediment measurement is usually accessible on a periodic or distinct basis. The measurement of sediment (suspended and bedload), especially in the field, is vital in keeping…

Abstract

Purpose

Sediment measurement is usually accessible on a periodic or distinct basis. The measurement of sediment (suspended and bedload), especially in the field, is vital in keeping essential data on sediment transport and deposition. Various techniques for measuring sediment have been used over time, each with its merits and demerits. The techniques discussed in this paper for suspended sediment include bottle, acoustic, pump, laser diffraction, nuclear and optical sampling. Techniques for bedload measurement include the river bedload trap (RBT), the CSU/FU bedload trap, the Helley–Smith sampler, the Polish Hydrological Services (PIHM) device, pit and trough samplers, the vortex tube, radioactive tracers and bedload-surrogate technologies. However, the choice of technique depends on multiple factors ranging from budget constraints to availability of equipment, manpower and data requirements. The purpose of this paper is to present valuable information on selected techniques used in sediment measurement, to aid researchers and practitioners in the choice of sediment measurement technique.

Design/methodology/approach

This paper presents a general review of selected field techniques used in sediment measurement (suspended and bedload). Each technique's mode of operation, merits and demerits are discussed.

Findings

This paper highlights that each technique has its peculiar merits and demerits. However, two techniques are generally preferred over the others: bottle sampling for suspended sediment and the Helley–Smith sampler for bedload. This is because these techniques are widely applicable and time-tested.

Originality/value

This review paper provides an in-depth description and comparison of selected existing field sediment measurement techniques. The objective is to ease decision-making about the choice of technique, as well as to identify the suitability and applicability of the chosen technique.

Details

World Journal of Engineering, vol. 16 no. 1
Type: Research Article
ISSN: 1708-5284
