Search results

1 – 10 of over 103,000
Article
Publication date: 13 June 2016

Jin Zhang, Yuehua Zhao and Yanyan Wang



Abstract

Purpose

Quantitative methods, especially statistical methods, play an increasingly important role in library and information science (LIS) research. The use of statistical methods varies substantially across journals owing to their different scopes and aims. The purpose of this paper is to explore the characteristics of statistical methodology use in six major scholarly journals in LIS.

Design/methodology/approach

Research papers that used statistical methods from the six major journals were selected and investigated. Content analysis method, descriptive statistical analysis method, and temporal analysis method were used to compare and analyze statistical method uses in research papers of the investigated journals.

Findings

The findings of this study show that there was a clear growth trend of statistical method uses in five of the investigated journals; statistical methods were used most in The Journal of the Association for Information Science and Technology and Information Processing & Management; and the top three most frequently used statistical methods were t-test, ANOVA test, and χ2-test.
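The three tests named in this finding are available in any standard statistics library; as a minimal illustration of the calls (the sample data below is invented, not from the paper):

```python
from scipy import stats

# Invented measurements for three groups, solely to demonstrate the calls.
a = [12.1, 11.8, 12.5, 12.0, 11.6]
b = [11.2, 11.0, 11.5, 10.9, 11.3]
c = [10.1, 10.4, 9.9, 10.2, 10.0]

t_stat, p_t = stats.ttest_ind(a, b)        # two-sample t-test
f_stat, p_f = stats.f_oneway(a, b, c)      # one-way ANOVA
chi2_stat, p_c, dof, expected = stats.chi2_contingency(
    [[30, 10], [20, 40]])                  # chi-squared test of independence
print(p_t, p_f, p_c)
```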

Originality/value

The findings can be used to better understand the application areas, patterns, and trends of statistical methods among the investigated journals and their statistical methodology orientations in research studies of LIS.

Details

Online Information Review, vol. 40 no. 3
Type: Research Article
ISSN: 1468-4527


Book part
Publication date: 18 April 2015

Amanar Akhabbar


Abstract

This chapter presents Chapter 1 of The Balance of the National Economy, 1923–24, edited by Pavel Illich Popov. The Balance was issued in June 1926 by the Central Statistical Administration (CSU or TsSU) of the USSR, which Popov had headed from July 1918 to January 1926. In the first part of our chapter, we show how Popov’s work on the balance of the national economy was rooted in the specific scientific and political culture of zemstvo statisticians inherited from the Tsarist era. Statistical inquiry was considered an objective scientific process based on international standards. Furthermore, like zemstvo statisticians, CSU statisticians developed great autonomous political power. The balance of the national economy was built according to these principles, which met harsh criticism from revolutionaries and Bolsheviks. In the second part, we analyze the contents of Popov’s Chapter 1, especially the theoretical foundations of the balance and its connection with Soviet planning. In the third part, we discuss the balance’s significance in the years 1926–1929, which ended the NEP and launched the first Five-Year Plan, so as to understand how CSU’s balance did not become a standard Soviet statistical instrument and was discarded as a “bourgeois” device.

Article
Publication date: 1 June 1998

Jiju Antony, Mike Kaye and Andreas Frangou



Abstract

It is widely considered that advanced statistical quality improvement techniques (ASQIT) such as design of experiments and Taguchi methods form an essential part of the search for effective quality control. These quality improvement techniques are well-established methodologies in which statisticians are formally trained. Research has shown that the application of these ASQIT to solving process quality problems by industrial engineers in manufacturing companies is limited, and that when they are applied they are often performed incorrectly. Presents a strategic and practical methodology with the aim of assisting industrial engineers to tackle manufacturing quality problems in a systematic and structured manner. The potential benefit of this practical methodology is its simplicity of use, which makes it readily accessible to the engineering fraternity for solving quality problems in real-life situations. Also highlights the results of five industrial case studies which were used to validate and refine the methodology.

Details

The TQM Magazine, vol. 10 no. 3
Type: Research Article
ISSN: 0954-478X


Article
Publication date: 1 September 1988

T.N. Goh


Abstract

Efficient techniques of information collection and analysis are essential to all quality and productivity improvement studies. Most established concepts of quality control are passive in nature, intended more for the maintenance of the status quo than for purposeful changes. Statistical experiment design methodologies constitute an active approach which can provide the kind of understanding of process and product characteristics that is needed for managing changes during design and manufacture. Systematic planning of data collection and analysis by these methodologies is a prerequisite for the attainment of higher productivity, as it enables the investigator to identify and evaluate important variables quickly, replacing the conventional single‐variable procedures by a far more efficient approach. The major features and potential applications of experiment design are outlined in a non‐technical language for the appreciation of managers.
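The contrast with single-variable procedures can be made concrete with a small sketch: a 2³ full factorial design estimates all three main effects from eight runs, where one-factor-at-a-time probing would need a separate pass per variable. The response function here is hypothetical, purely for illustration:

```python
import itertools
import numpy as np

# 2^3 full factorial: every combination of three two-level factors.
design = np.array(list(itertools.product([-1, 1], repeat=3)))

def simulated_yield(run):
    a, b, c = run
    # Hypothetical process response: factor A helps, C hurts.
    return 60 + 5 * a + 2 * b - 3 * c

y = np.array([simulated_yield(r) for r in design])

# Main effect of each factor: mean response at +1 minus mean at -1.
effects = design.T @ y / (len(design) / 2)
print(effects)  # [10.  4. -6.]: twice each underlying coefficient
```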

Details

Industrial Management & Data Systems, vol. 88 no. 9/10
Type: Research Article
ISSN: 0263-5577


Open Access
Article
Publication date: 25 April 2023

Manuela Cazzaro and Paola Maddalena Chiodini



Abstract

Purpose

Although the Net Promoter Score (NPS) index is simple, it has weaknesses that can make its interpretation misleading. The main criticism is that identical index values can correspond to different levels of customer loyalty, which makes it difficult to determine whether the company is improving or deteriorating between two different years. The authors describe the application of statistical tools to establish whether identical values may or may not be considered similar under statistical hypotheses.

Design/methodology/approach

Equal NPSs with a “similar” component composition should have a two-way table satisfying marginal homogeneity hypothesis. The authors compare the marginals using a cumulative marginal logit model that assumes a proportional odds structure: the model has the same effect for each logit. Marginal homogeneity corresponds to null effect. If the marginal homogeneity hypothesis is rejected, the cumulative odds ratio becomes a tool for measuring the proportionality between the odds.
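The authors' cumulative marginal logit model is not reproduced here, but the underlying idea — testing whether the two marginal distributions of the square table are homogeneous — can be sketched with the simpler Stuart-Maxwell test (an assumption of this illustration, not the paper's exact procedure; the table counts are invented):

```python
import numpy as np
from scipy.stats import chi2

def nps(counts):
    """NPS in percent from (detractor, passive, promoter) counts."""
    d, _, p = counts
    return 100.0 * (p - d) / sum(counts)

def stuart_maxwell(table):
    """Stuart-Maxwell test of marginal homogeneity for a k x k table.

    Under H0 (homogeneous margins) the statistic is chi-squared
    with k - 1 degrees of freedom.
    """
    t = np.asarray(table, dtype=float)
    k = t.shape[0]
    row, col = t.sum(axis=1), t.sum(axis=0)
    d = (row - col)[: k - 1]
    v = np.diag((row + col - 2 * np.diag(t))[: k - 1])
    for i in range(k - 1):
        for j in range(k - 1):
            if i != j:
                v[i, j] = -(t[i, j] + t[j, i])
    stat = float(d @ np.linalg.solve(v, d))
    return stat, float(chi2.sf(stat, k - 1))

# Same respondents classified at two times: rows = time 1, columns = time 2,
# categories ordered (detractor, passive, promoter).
table = np.array([[30, 5, 5],
                  [15, 30, 15],
                  [5, 5, 90]])
print(nps(table.sum(axis=1)), nps(table.sum(axis=0)))  # 30.0 30.0
print(stuart_maxwell(table))  # margins differ despite identical NPS
```

Here both time points yield an identical NPS of 30, yet the test rejects marginal homogeneity — exactly the ambiguity the paper targets.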

Findings

The authors propose an algorithm that helps managers in their decision-making process. The methodology provides a statistical tool to recognize customer base compositions. The authors suggest a statistical test of the homogeneity of the marginal distributions of the table representing the index compositions at two times. Through the calculation of cumulative odds ratios, they assess the hypothesis of equality of the NPSs.

Originality/value

The authors' contribution provides a statistical alternative that can be easily implemented by business operators to address the known shortcomings of the index in the customer satisfaction context. This paper confirms that although a single number summarizes and communicates a complex situation very quickly, the number is ambiguous and unreliable if not accompanied by other tools.

Details

The TQM Journal, vol. 35 no. 9
Type: Research Article
ISSN: 1754-2731


Book part
Publication date: 10 June 2009

Andreas Schwab and William H. Starbuck


Abstract

Null-hypothesis significance tests (NHST) are a very troublesome methodology that dominates quantitative empirical research in strategy and management. Inherent limitations and inappropriate applications of NHST impede the accumulation of knowledge, fill academic journals with meaningless “findings,” and corrode researchers' motivation and ethics. Inherent limitations of NHST include the use of point null hypotheses, meaningless null hypotheses, and dichotomous truth criteria. Misunderstanding of NHST has often led to its application to inappropriate data and to misinterpretation of results.

Researchers should move beyond the ritualistic and often inappropriate use of NHST. The chapter does not advocate a best way to do research, but suggests that researchers need to adapt their methods to reflect specific contexts and to use evaluation criteria that are meaningful for those contexts. Researchers need to explain the rationales that guided the selection of evaluation measures and they should avoid excessively complex models with many variables. The chapter also offers four more focused recommendations: (1) Compare proposed hypotheses with naïve hypotheses or the outcomes of alternative treatments. (2) Acknowledge the uncertainty that attends research findings by stating confidence limits for parameter estimates. (3) Show the substantive relevance of findings by reporting effect sizes – preferably with confidence limits. (4) Use statistical methods that are robust against deviations from assumptions about population distributions and the representativeness of samples.
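Recommendations (2) and (3) — confidence limits and effect sizes — cost only a few lines in practice. A minimal sketch with invented group data (Cohen's d plus a percentile-bootstrap interval for the mean difference; the data and 95% level are assumptions of the example):

```python
import numpy as np

def effect_size_with_ci(a, b, n_boot=5000, seed=0):
    """Cohen's d plus a 95% percentile-bootstrap CI for the mean difference."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled_var = ((len(a) - 1) * a.var(ddof=1) +
                  (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
    d = (a.mean() - b.mean()) / np.sqrt(pooled_var)
    rng = np.random.default_rng(seed)
    diffs = [rng.choice(a, len(a)).mean() - rng.choice(b, len(b)).mean()
             for _ in range(n_boot)]
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return d, (lo, hi)

# Invented treatment/control measurements, purely illustrative.
treatment = [5.2, 4.9, 5.5, 5.1, 4.8, 5.3, 5.0, 5.4]
control = [4.5, 4.7, 4.2, 4.6, 4.4, 4.8, 4.3, 4.5]
d, (lo, hi) = effect_size_with_ci(treatment, control)
print(round(d, 2), (round(lo, 2), round(hi, 2)))
```

Reporting the interval alongside d conveys both the substantive size of the difference and the uncertainty around it, rather than a bare p-value.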

Details

Research Methodology in Strategy and Management
Type: Book
ISBN: 978-1-84855-159-6

Article
Publication date: 20 March 2009

Joanne S. Utley and J. Gaylord May



Abstract

Purpose

The purpose of this paper is to devise a robust statistical process control methodology that will enable service managers to better monitor the performance of correlated service measures.

Design/methodology/approach

A residuals control chart methodology based on least absolute value regression (LAV) is developed and its performance is compared to a traditional control chart methodology that is based on ordinary least squares (OLS) regression. Sensitivity analysis from the goal programming formulation of the LAV model is also performed. The methodology is applied in an actual service setting.
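The abstract does not give the goal-programming details, so as an illustrative sketch only: an LAV (L1) regression can be fit as a linear program, and a residuals chart built with robust (median/MAD) control limits. The data, the 3-sigma rule, and the MAD scale factor here are all assumptions of the example, not the paper's specification:

```python
import numpy as np
from scipy.optimize import linprog

def lav_fit(X, y):
    """L1 (least absolute value) regression via linear programming:
    min sum(u + v)  subject to  X @ beta + u - v = y,  u, v >= 0."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

def residuals_chart(X, y, k=3.0):
    """Flag points whose LAV residual exceeds robust k-sigma limits."""
    beta = lav_fit(X, y)
    r = y - X @ beta
    center = np.median(r)
    sigma = 1.4826 * np.median(np.abs(r - center))  # MAD-based scale
    return beta, r, np.abs(r - center) > k * sigma

# Simulated service metric: linear trend plus noise, with one injected shift.
rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 3.0 * x + rng.normal(0, 0.3, size=x.size)
y[12] += 15.0                      # out-of-control observation
beta, r, flagged = residuals_chart(X, y)
print(np.round(beta, 2))           # near [2, 3]: the fit resists the outlier
print(np.flatnonzero(flagged))     # index 12 is among the flagged points
```

Because the L1 fit and the MAD scale both downweight extreme points, the injected shift stands out in the residuals instead of distorting the chart's center line and limits — the robustness property the study exploits.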

Findings

The LAV-based residuals control chart outperformed the OLS-based residuals control chart in identifying out-of-control observations. The LAV methodology was also less sensitive to outliers than the OLS approach.

Research limitations/implications

The findings from this study suggest that the proposed LAV-based approach is a more robust statistical process control method than the OLS approach. In addition, the goal programming formulation of the LAV regression model permits sensitivity analysis, whereas the OLS approach does not.

Practical implications

This paper shows that, compared to the traditional OLS-based control chart, the LAV-based residuals chart may be better suited to actual service settings in which normality requirements are not met and the amount of data is limited.

Originality/value

This paper is the first study to use a least absolute value regression model to develop a residuals control chart for monitoring service data. The proposed LAV methodology can help service managers monitor related performance metrics more effectively as part of a quality improvement program such as Six Sigma.

Details

Managing Service Quality: An International Journal, vol. 19 no. 2
Type: Research Article
ISSN: 0960-4529


Article
Publication date: 11 April 2016

Hugo Entradas Silva and Fernando M.A. Henriques


Abstract

Purpose

The purpose of this paper is to verify the applicability and efficiency of two statistical methods to obtain sustainable targets of temperature and relative humidity in historic buildings located in temperate climates.

Design/methodology/approach

The data recorded over one year in a non-heated historic building in Lisbon (Portugal) were analysed with the two methodologies, EN 15757 and FCT-UNL. To evaluate their adequacy, the performance index was calculated for each target and the mechanical and biological degradation risks were verified.

Findings

While both approaches are suitable for temperate climates, the FCT-UNL methodology is more efficient, allowing a better response for the three parameters under evaluation.

Research limitations/implications

Despite the better results obtained, the FCT-UNL methodology was only tested for one city; therefore the application to other situations may be required to obtain more robust conclusions.

Practical implications

The effectiveness of the FCT-UNL methodology in obtaining sustainable climate targets can lead to important energy conservation in historic buildings and contribute to changing old approaches in the preventive conservation area.

Originality/value

This paper provides a comparison between two recent methods. The results can lead to some advances in the science of preventive conservation, of interest to conservators and building physics scientists.

Details

Structural Survey, vol. 34 no. 1
Type: Research Article
ISSN: 0263-080X


Article
Publication date: 2 February 2015

Ahmad Sarfaraz, Kouroush Jenab and Andrew Bowker


Abstract

Purpose

The purpose of this paper is to examine the need for a statistical approach in the development of personnel aspiring to a technical manager/technical team leader position in order to increase corporate profitability. It outlines the details of management training for managerial positions by reviewing thirty academic studies in management strategy as well as real-world experiences, providing a statistical viewpoint on the development of a technical manager/technical team leader as a significant contributor to profitability within the corporate landscape.

Design/methodology/approach

This study begins by validating the strategic management model (Process 1), which states that managerial influences in the organization are important to consider and greatly affect the profitability of a corporation. Statistical methodologies are introduced as tools for the analysis of the development of the technical manager/technical team leader position. Using interval-based analytic hierarchy process (i-AHP), the development process starts with Process 2 at the employee’s initial position, developing the employee using personnel management techniques and statistically measuring motivation and commitment. After the employee has demonstrated their abilities and gained knowledge of the people and processes, another assessment is conducted to enter the employee into a development position, Process 3. Process 3 considers the key metrics necessary to allow the employee to develop corporate advantages. Process 4 shows the critical concepts (Process 1 and Process 2) that managers must consider when taking a technical manager/technical team leader role, including personnel development, knowledge management, and project management.

Findings

Corporate profitability is profoundly dependent on the development of employees throughout their careers. The profitability achieved within a corporate landscape can be evaluated and improved through the proposed processes. These processes can not only improve the ability of a newly appointed technical manager/technical team leader to overcome obstacles and navigate difficult situations, but also increase the chances that they will develop employees who raise corporate profitability, raise profitability themselves, and grow into future managers who significantly increase corporate profitability.

Originality/value

This study proposes statistical processes to develop a technical manager/technical team leader candidate by creating a link between the initiation of the employee’s career, their future job positioning, team-working skills, leadership attributes, development of technical management/technical team leading skills, and later management skills to increase corporate profitability.

Details

Benchmarking: An International Journal, vol. 22 no. 1
Type: Research Article
ISSN: 1463-5771


Article
Publication date: 1 June 2012

Ali Quazi and Alice Richardson



Abstract

Purpose

The purpose of this paper is to identify the possible sources of variation in the results of prior studies linking corporate social responsibility (CSR) with corporate financial performance (CFP).

Design/methodology/approach

A meta‐analysis was performed on 51 prior studies included in Orlitzky et al. in order to ensure compatibility with previous results. The meta‐analysis is based on sub‐groups of papers in five‐year time intervals focusing on sample size and methodology employed as the sources of variation concerning the nexus between CSR and CFP.
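As context for how such sub-group meta-analyses combine correlational results, a generic fixed-effect sketch — not necessarily the weighting scheme used by Orlitzky et al. or this paper, and with invented correlations and sample sizes:

```python
import numpy as np

def pool_correlations(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z transform.

    Each study's z has variance 1/(n - 3), so studies are weighted
    by n - 3; the pooled z and its 95% CI are mapped back to r.
    """
    z = np.arctanh(np.asarray(rs, float))
    w = np.asarray(ns, float) - 3
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return np.tanh([z_bar, z_bar - 1.96 * se, z_bar + 1.96 * se])

# Three hypothetical CSR-CFP correlations from studies of different sizes.
r_pooled, lo, hi = pool_correlations([0.3, 0.2, 0.4], [50, 100, 80])
print(round(r_pooled, 3), (round(lo, 3), round(hi, 3)))
```

Because the weights depend directly on sample size, sub-grouping studies by n (as this paper does) changes the pooled estimate — one way sample size becomes a source of variation across meta-analytic results.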

Findings

The major finding of the study is that sample size and methodology are significant sources of variation in measuring the link between CSR and CFP.

Research limitations/implications

The findings are likely to help develop a structural framework towards broadening and deepening our understanding of the debate regarding the sources of variation in the measurement of CSR and CFP link. This research is limited to papers published up to 1999 as included in Orlitzky et al. Future research can update the findings by using data beyond 1999.

Originality/value

This paper can be considered an advance on previous research as it contributes to broadening our understanding of the possible sources of variation in the results of studies linking CSR with CFP.

Details

Social Responsibility Journal, vol. 8 no. 2
Type: Research Article
ISSN: 1747-1117

