Search results

1 – 10 of over 2000
Article
Publication date: 31 March 2023

Pei-Chi Kelly Hsiao, Mary Low and Tom Scott

Abstract

Purpose

This paper aims to examine the extent to which performance indicators (PIs) reported by New Zealand (NZ) higher education institutions (HEIs) correspond with accounting standards and guidance, and the effects that the issuance of principles-based authoritative guidance and the early adoption of Public Benefit Entity Financial Reporting Standard 48 (PBE FRS 48) have on the PIs disclosed.

Design/methodology/approach

Using a content analysis index derived from accounting standards and guidance, we conduct a longitudinal assessment of the 2016 and 2019 statements of service performance published by 22 NZ HEIs.

Findings

The PIs reported extend beyond the service performance elements proposed by standard-setters. Despite few indicators on intermediate and broader outcomes, the measures disclosed by HEIs are reflective of their role in the NZ economy and the national Tertiary Education Strategy. The results show that principles-based authoritative guidance and early adoption of PBE FRS 48 influence the focus and type of measures disclosed, while there is no evidence of improvements in the reporting of impacts, outcomes and information useful for performance evaluation.

Practical implications

This paper provides timely insights for standard-setters and regulators on the influence principles-based accounting standards and guidance have on non-financial reporting practices.

Originality/value

This study contributes to the scant literature on HEIs’ service performance reporting. It presents a model for conceptualising HEIs’ PIs that can be used as a basis for future research on non-financial reporting. It also reflects on the tension between accountability and “accountingisation”, suggesting that, although the PIs reported support formal accountability, they do not communicate whether HEIs’ activities and outputs meet their social purpose.

Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of the process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the "Analyze" stage of Six Sigma's DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework – the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. It is the first Quality 4.0 framework, according to the SLR conducted, to utilize the principal component analysis technique as a substitute for “Screening Design” in the Design of Experiments phase and K-means clustering technique for multivariable analysis, identifying process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
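The variable-screening step this abstract describes — principal component analysis in place of a "Screening Design", followed by K-means clustering for multivariable analysis — can be sketched roughly as follows. The data, variable counts and thresholds here are illustrative assumptions, not the paper's automotive case study:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical process data: 200 production runs, 6 process variables
X = rng.normal(size=(200, 6))
X[:, 1] = 0.9 * X[:, 0] + rng.normal(scale=0.1, size=200)  # correlated pair

Xs = StandardScaler().fit_transform(X)

# PCA as a substitute for a screening design: retain components
# explaining ~90% of variance, then rank variables by loading magnitude
pca = PCA(n_components=0.90).fit(Xs)
loadings = np.abs(pca.components_).sum(axis=0)
ranked = np.argsort(loadings)[::-1]  # most influential variables first

# K-means on the retained component scores for multivariable analysis,
# grouping runs into regimes that can be inspected for quality differences
scores = pca.transform(Xs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
```

In a real DMAIC "Analyze" step, the cluster labels would be cross-tabulated against defect outcomes to identify which process regimes drive quality.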

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

Open Access
Article
Publication date: 6 February 2024

Abdelmoneim Bahyeldin Mohamed Metwally and Ahmed Diab

Abstract

Purpose

In developing countries, how risk management technologies influence management accounting and control (MAC) practices is under-researched. By drawing on insights from institutional studies, this study aims to examine the multiple institutional pressures surrounding an entity and influencing its risk-based management control (RBC) system – that is, how RBC emerges in an emerging market characterized by institutional multiplicity.

Design/methodology/approach

The authors used qualitative case study research methods to collect empirical evidence from a privately owned Egyptian insurance company.

Findings

The authors observed that in the transformation to risk-based controls, especially in socio-political settings such as Egypt, changes in MAC systems were consistent with the shifts in the institutional context. Along with changes in the institutional environment, the case company sought to configure its MAC system to be more risk-based to achieve its strategic goals effectively and maintain its sustainability.

Originality/value

This research provides a fuller view of risk-based management controls based on the social, professional and political perspectives central to the examined institutional environment. Moreover, unlike early studies that reported resistance to RBC, this case reveals the institutional dynamics contributing to the successful implementation of RBC in an emerging market.

Details

Qualitative Research in Accounting & Management, vol. 21 no. 2
Type: Research Article
ISSN: 1176-6093

Open Access
Article
Publication date: 27 March 2023

Michael Adu Kwarteng, Alex Ntsiful, Christian Nedu Osakwe and Kwame Simpe Ofori

Abstract

Purpose

This study proposes and validates an integrated theoretical model involving the theory of planned behavior (TPB), health belief model (HBM), personal norms and information privacy to understand determinants of acceptance and resistance to the use of mobile contact tracing app (MCTA) in a pandemic situation.

Design/methodology/approach

This study draws on an online survey of 194 respondents and uses partial least squares structural equation modeling (PLS-SEM) to test the proposed theoretical model.

Findings

The study establishes that a positive attitude towards MCTA is the most important predictor of both individuals' willingness to use MCTA and their resistance to its use. Furthermore, barriers to taking action positively influence resistance to the use of MCTA, while personal norms negatively influence it. Information privacy showed a negative influence on willingness to use MCTA and a positive influence on resistance to its use, but neither effect was statistically significant. The authors found no significant influence of perceived vulnerability, severity, subjective norms or perceived behavioral control on either acceptance or use resistance of MCTA.

Originality/value

The study has been one of the first in the literature to propose an integrated theoretical model in the investigation of the determinants of acceptance and resistance to the use of MCTA in a single study, thereby increasing the scientific understanding of the factors that can facilitate or inhibit individuals from engaging in the use of a protection technology during a pandemic situation.

Peer review

The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-10-2021-0533

Details

Online Information Review, vol. 48 no. 1
Type: Research Article
ISSN: 1468-4527

Article
Publication date: 25 December 2023

Zihan Dang and Naiming Xie

Abstract

Purpose

The assembly line is a common production form that has been used effectively in many industries, but the imprecise processing time of each process makes production line balancing and capacity forecasting the most troublesome problems for production managers. In this paper, uncertain man-hours are represented as interval grey numbers, and the production line balance optimization problem under interval grey man-hours is studied to better evaluate production line capacity.

Design/methodology/approach

First, this paper constructs the basic model of assembly line balance optimization for the single-product scenario and, on this basis, an assembly line balance optimization model for the multi-product scenario with the objective of maximizing the weighted grey production line balance rate. Second, it designs a simulated annealing algorithm with a neighborhood search strategy to solve the problem. Finally, building on the balance optimization, an assembly line capacity evaluation method with interval grey man-hour characteristics is designed.
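The combination described here — interval grey task times, a balance objective, and simulated annealing with a neighborhood move — can be sketched in a minimal form. The task data, station count, whitening by interval midpoint and cooling schedule are all illustrative assumptions, not the paper's model:

```python
import math
import random

random.seed(1)

# Hypothetical task times as interval grey numbers (lower, upper), in minutes
tasks = [(2, 3), (4, 6), (1, 2), (3, 5), (2, 4), (5, 7)]
n_stations = 3

def station_loads(assign):
    # Load of each station, whitening each interval to its midpoint
    loads = [0.0] * n_stations
    for (lo, hi), s in zip(tasks, assign):
        loads[s] += (lo + hi) / 2
    return loads

def balance_cost(assign):
    loads = station_loads(assign)
    return max(loads) - min(loads)  # smaller spread = better balance

# Simulated annealing with a simple neighborhood move:
# reassign one randomly chosen task to a random station
assign = [random.randrange(n_stations) for _ in tasks]
best, best_cost = assign[:], balance_cost(assign)
T = 10.0
while T > 0.01:
    cand = assign[:]
    cand[random.randrange(len(tasks))] = random.randrange(n_stations)
    d = balance_cost(cand) - balance_cost(assign)
    if d < 0 or random.random() < math.exp(-d / T):  # accept worse moves early
        assign = cand
        if balance_cost(assign) < best_cost:
            best, best_cost = assign[:], balance_cost(assign)
    T *= 0.95  # geometric cooling
```

A fuller implementation would carry the interval bounds through the objective (a grey balance rate) rather than whitening to midpoints.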

Findings

This paper provides a production line balance optimization scheme for multi-product scenarios with uncertain processing times and designs a capacity evaluation method that offers managers scientific management strategies, so that decision-makers can address the gap between a company's designed production line and the actual production situation.

Originality/value

There are few studies in the literature combining interval grey numbers with assembly line balance optimization. Therefore, this paper makes an important contribution in this regard.

Details

Grey Systems: Theory and Application, vol. 14 no. 2
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 22 March 2024

Ravichandran Joghee and Reesa Varghese

Abstract

Purpose

The purpose of this article is to study the link between mean shift and inflation coefficient when the underlying null hypothesis is rejected in the analysis of variance (ANOVA) application after the preliminary test on the model specification.

Design/methodology/approach

A new approach is proposed to study the link between mean shift and inflation coefficient when the underlying null hypothesis is rejected in the ANOVA application. First, we determine this relationship from the general perspective of Six Sigma methodology under the normality assumption. Then, the approach is extended to a balanced two-stage nested design with a random effects model in which a preliminary test is used to fix the main test statistic.

Findings

The features of mean-shifted and inflated (but centred) processes with the same specification limits from the perspective of Six Sigma are studied. The shift and inflation coefficients are derived for the two-stage balanced ANOVA model. We obtained good predictions for the process shift, given the inflation coefficient, which has been demonstrated using numerical results and applied to case studies. It is understood that the proposed method may be used as a tool to obtain an efficient variance estimator under mean shift.

Research limitations/implications

In this work, as a new research approach, we studied the link between mean shift and inflation coefficients when the underlying null hypothesis is rejected in the ANOVA. Derivations for these coefficients are presented, and the results when the null hypothesis is accepted are also studied. The approach requires preliminary tests to decide on the model assumptions, so researchers are expected to be familiar with the application of preliminary tests.

Practical implications

After studying the proposed approach with extensive numerical results, we have provided two practical examples that demonstrate the significance of the approach for real-time practitioners. The practitioners are expected to take additional care before deciding on the model assumptions by applying preliminary tests.

Originality/value

The proposed approach is original in the sense that no similar approach in the literature combines Six Sigma and preliminary tests in ANOVA applications.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 5 April 2024

Fangqi Hong, Pengfei Wei and Michael Beer

Abstract

Purpose

Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the posterior variance contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, these sequential design strategies prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve computational efficiency.

Design/methodology/approach

By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make important contributions to the integration accuracy and can be selected as design points, providing a practical way to parallelize the adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are introduced to find multiple local maxima of the PVC function in one run and thereby enable parallel implementation of the adaptive BC.
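The core idea — treat every distinct local maximum of the acquisition function as a member of one parallel batch of design points — can be illustrated with a simple multi-start search. The toy acquisition function, search bounds and deduplication radius below are stand-ins (the paper's PVC function is derived from a Gaussian process posterior, and it introduces dedicated multimodal optimizers rather than multi-start L-BFGS-B):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Stand-in acquisition function with several local maxima
def acq(x):
    return np.sin(3 * x) * np.exp(-0.1 * x**2)

# Multi-start local optimization of -acq, then deduplicate to distinct maxima
starts = rng.uniform(-4, 4, size=30)
maxima = []
for x0 in starts:
    res = minimize(lambda x: -acq(x[0]), x0=[x0], bounds=[(-4, 4)])
    x = res.x[0]
    if all(abs(x - m) > 0.3 for m in maxima):  # keep only distinct modes
        maxima.append(x)

# The distinct maxima form one batch of design points, so the expensive
# integrand can be evaluated at all of them in parallel
batch = sorted(maxima)
```

This is the parallelization lever the abstract describes: instead of one point per adaptive iteration, each iteration yields a whole batch of integrand evaluations.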

Findings

The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.

Originality/value

Multimodal behavior of acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to adaptive BC accuracy. Parallelization of adaptive BC is realized with four multimodal optimization methods.

Details

Engineering Computations, vol. 41 no. 2
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 12 December 2023

Robert Mwanyepedza and Syden Mishi

Abstract

Purpose

The study aims to estimate the short- and long-run effects of monetary policy on residential property prices in South Africa. Over the past decades, there has been a shift in monetary policy from targeting the money supply and the exchange rate to targeting inflation. These shifts have affected residential property market dynamics.

Design/methodology/approach

The Johansen cointegration approach was used to estimate the effects of changes in monetary policy proxies on residential property prices using quarterly data from 1980 to 2022.

Findings

Mortgage finance and economic growth have a significant positive long-run effect on residential property prices, while the consumer price index, the inflation targeting framework, interest rates and exchange rates have a significant negative long-run effect. The Granger causality test showed that the exchange rate significantly influences residential property prices in the short run, and that interest rates, the inflation targeting framework, gross domestic product, money supply, consumer price index and exchange rate can quickly return to equilibrium when in disequilibrium.

Originality/value

There is limited evidence on whether the inflation targeting monetary policy framework in South Africa has prevented residential property market boom-and-bust scenarios. The study finds that the implementation of the inflation targeting framework has successfully reduced booms in residential property prices in South Africa.

Details

International Journal of Housing Markets and Analysis, vol. 17 no. 7
Type: Research Article
ISSN: 1753-8270

Article
Publication date: 7 March 2023

Jiju Antony, Laynes Lauterbach, Elisabeth Viles, Martin Tanco, Sandy Furterer and Ronald D. Snee

Abstract

Purpose

This article presents a novel case study that analyzes the applicability of design of experiments (DoE) in the sport of curling in order to improve the performance of the sport and its athletes. Specifically, this study analyzes the most important factors for increasing accuracy and precision in the draw game, informed by curlers' opinions. The "Last Stone Draw" (LSD) was chosen as an appropriate play situation.

Design/methodology/approach

Specifically, this study analyzes the most important factors for increasing accuracy and precision in the draw game, drawing on the opinions of curlers from the German Curling Association. Three research techniques were used: a case study, interviews and a designed experiment. The DoE analysis includes a measurement system analysis, an initial variance test between two players, and a screening and a characterization experiment.
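A screening experiment of the kind mentioned starts from a coded two-level design matrix. The sketch below generates a full factorial for three of the factors named in the study (factor selection and the two-level coding are illustrative assumptions, not the paper's actual design, which may have been fractional):

```python
from itertools import product

# Hypothetical two-level screening design for three factors from the study,
# coded -1 (low) / +1 (high)
factors = {"routine": (-1, 1), "stress": (-1, 1), "release": (-1, 1)}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
# Full factorial: 2^3 = 8 runs; a half-fraction would need only 4
```

Each run dictionary specifies one experimental condition under which a draw would be thrown and its distance to the target recorded.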

Findings

The results obtained from DoE suggest that the factors routine, stress, release, balance and the previous play situation have a substantial impact on the score of the player's draw game. However, no single factor has a statistically significant impact on the average distance to the center of the target. Moreover, the DoE analysis concludes that the accuracy and precision of a player's performance are not affected equally by all analyzed factors, but the factors become highly significant when examined in relation to one another.

Practical implications

The findings of this study can be beneficial to other sports in improving performance. Moreover, DoE has proved to be an invaluable tool for many people in the German Curling Association in understanding which factors influence curlers' performance and which do not.

Originality/value

This research contributes to the existing sports management literature by identifying how DoE can be an effective tool in non-manufacturing settings for identifying the most important factors that influence curling performance.

Details

The TQM Journal, vol. 36 no. 2
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 3 July 2023

James L. Sullivan, David Novak, Eric Hernandez and Nick Van Den Berg

Abstract

Purpose

This paper introduces a novel quality measure, the percent-within-distribution, or PWD, for acceptance and payment in a quality control/quality assurance (QC/QA) performance specification (PS).

Design/methodology/approach

The new quality measure takes any sample size or distribution and uses a Bayesian updating process to re-estimate parameters of a design distribution as sample observations are fed through the algorithm. This methodology can be employed in a wide range of applications, but the authors demonstrate the use of the measure for a QC/QA PS with upper and lower bounds on 28-day compressive strength of in-place concrete for bridge decks.
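The Bayesian updating step described here can be illustrated with a conjugate normal model: a design distribution whose mean is re-estimated as each sample observation arrives, with an acceptance percentage then read off the posterior predictive between the spec limits. All numbers (prior, process spread, spec limits, observations) are illustrative assumptions, not the paper's calibrated values:

```python
import math

# Hypothetical design distribution for 28-day compressive strength (MPa)
mu0, tau0 = 35.0, 4.0   # prior mean and prior std of the mean
sigma = 3.0             # assumed known process std
LSL, USL = 28.0, 45.0   # lower/upper specification limits

def update(mu, tau, x):
    # Conjugate normal update of the mean given known process variance
    prec = 1 / tau**2 + 1 / sigma**2
    mu_new = (mu / tau**2 + x / sigma**2) / prec
    return mu_new, math.sqrt(1 / prec)

def Phi(z):
    # Standard normal CDF
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, tau = mu0, tau0
for x in [33.1, 36.4, 34.8, 35.9, 32.7]:  # sample lot observations
    mu, tau = update(mu, tau, x)

# Percent of the posterior predictive distribution inside the spec limits
sd = math.sqrt(sigma**2 + tau**2)
pwd = 100 * (Phi((USL - mu) / sd) - Phi((LSL - mu) / sd))
```

Unlike the sample-based PWL, this estimate works for any lot size, since each observation simply tightens the posterior; the paper's measure additionally handles non-normal design distributions.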

Findings

The authors demonstrate the use of this new quality measure to illustrate how it addresses the shortcomings of the percent-within-limits (PWL), which is the current industry standard quality measure. The authors then use the PWD to develop initial pay factors through simulation regimes. The PWD is shown to function better than the PWL with realistic sample lots simulated to represent a variety of industry responses to a new QC/QA PS.

Originality/value

The analytical contribution of this work is the introduction of the new quality measure. However, the practical and managerial contributions of this work are of equal significance.

Details

International Journal of Quality & Reliability Management, vol. 41 no. 2
Type: Research Article
ISSN: 0265-671X
