Search results

1 – 10 of 929
Article
Publication date: 1 August 2003

Paul Barrett

Downloads
4120

Abstract

A statement from Michell (Michell, J., “Normal science, pathological science, and psychometrics”, Theory and Psychology, Vol. 10 No. 5, 2000, pp. 639‐67), “psychometrics is a pathology of science”, is contrasted with conventional definitions provided by leading texts. The key to understanding why Michell has made such a statement is bound up in the definition of measurement that characterises quantification of variables within the natural sciences. By describing the key features of quantitative measurement, and contrasting these with current psychometric practice, it is argued that Michell is correct in his assertion. Three avenues of investigation would seem to follow from this position, each of which, it is suggested, will gradually replace current psychometric test theory, principles, and properties. The first attempts to construct variables that can be demonstrated empirically to possess a quantitative structure. The second proceeds on the basis of using qualitative (non‐quantitatively structured) variable structures and procedures. The third, applied numerics, is an applied methodology whose sole aim is pragmatic utility; it is similar in some respects to current psychometric procedures except that “test theory” can be discarded in favour of simpler tests of observational reliability and validity. Examples are presented of what future practice may look like in each of these areas. It is to be hoped that psychometrics begins to concern itself more with the logic of its measurement, rather than the ever‐increasing complexity of its numerical and statistical operations.

Details

Journal of Managerial Psychology, vol. 18 no. 5
Type: Research Article
ISSN: 0268-3946

Keywords

Article
Publication date: 7 August 2019

Carl Norwood, Anna Tickle, Danielle De Boos and Roberta Dewa

Abstract

Purpose

The involvement of service users within clinical psychology training is written into policy. However, the practice of evaluating involvement from both trainees’ and service users’ viewpoint is minimal. The purpose of this paper is to evaluate recent service user involvement in psychometrics and formulation teaching on a clinical psychology training programme, from both service user and trainee perspectives.

Design/methodology/approach

Focus groups were held with service users (n=3) involved in the teaching, as well as trainees (n=3). Additional questionnaire data were captured from trainees (n=11). Service user and trainee data were analysed separately using thematic analysis. Themes generated for trainees were also mapped on to a competency framework for clinical psychologists.

Findings

Both parties found the teaching beneficial. Service users enjoyed supporting trainees and engaged positively in their roles. They identified relational aspects and reflections on their own therapy as other benefits. Trainees reported enhanced clinical preparedness, critical and personal reflection. Trainee anxiety was evident. Learning mapped well to competency frameworks.

Research limitations/implications

The samples were small and some data were truncated. Nevertheless, the findings speak to broader issues and may transfer to other involvement contexts.

Practical implications

A good degree of meaningful involvement can be achieved through such initiatives, to mutual benefit and enhanced learning.

Originality/value

The nature of the exercise and the dual-aspect approach to evaluation described here help to minimise tokenism. The mapping of findings to competency frameworks supports evaluative processes and helps to legitimise involvement initiatives that challenge the boundaries of existing practice.

Details

The Journal of Mental Health Training, Education and Practice, vol. 14 no. 5
Type: Research Article
ISSN: 1755-6228

Keywords

Article
Publication date: 19 July 2011

Interview by Juliet Harrison

Downloads
546

Abstract

Purpose

To provide an interview with Mark Batey, R&D Director of E‐Metrixx & Joint Chairman of the Psychometrics at Work Research Group at Manchester Business School.

Design/methodology/approach

This briefing is prepared by an independent interviewer.

Findings

Dr Batey is an international authority on the Psychology of Creativity. In 2009, he was ranked second in the world for published research into creativity and in 2010, appeared with Lord Robert Winston on BBC's Child of Our Time. He presented at the 2010 HR Conference on the topic of “Addressing the Creativity Crisis,” looking at what we should be doing to develop creativity within ourselves and our organizations.

Practical implications

Provides guidance on how to change the culture of an organization to encourage creativity and how creativity can help businesses to survive turbulent markets.

Originality/value

Dr Batey draws on his experience as a psychologist and Chair of the Psychometrics at Work Research Group to offer businesses a new model for working. His research and training cover creativity, personality, financial behaviour, risk behaviour, attitudes towards fraud, emotional intelligence, and a range of other individual differences. Through his interview, Dr Batey highlights how this knowledge can be used to work more effectively.

Details

Human Resource Management International Digest, vol. 19 no. 5
Type: Research Article
ISSN: 0967-0734

Keywords

Content available
Article
Publication date: 20 March 2009

Downloads
285

Abstract

Details

Human Resource Management International Digest, vol. 17 no. 2
Type: Research Article
ISSN: 0967-0734

Content available
Article
Publication date: 3 July 2007

W. Lord

Downloads
2558

Abstract

Details

Development and Learning in Organizations: An International Journal, vol. 21 no. 4
Type: Research Article
ISSN: 1477-7282

Keywords

Content available
Downloads
523

Abstract

Details

Industrial and Commercial Training, vol. 40 no. 5
Type: Research Article
ISSN: 0019-7858

Content available
Article
Publication date: 13 February 2009

Downloads
261

Abstract

Details

Development and Learning in Organizations: An International Journal, vol. 23 no. 2
Type: Research Article
ISSN: 1477-7282

Article
Publication date: 1 April 1989

Sandra I. Cheldelin and Louis A. Foritano

Abstract

Successful businesses today are attending to internal and external changes. Their leaders value diversity and seek a heterogeneous workforce, recognising that effective work teams are essential in the process. Consultants can assist in shaping and managing the new organisational cultures that result. This article describes a particular team-building method used with a Fortune 50 client, drawing on the results of a five-month consultation that involved 20 subgroups in eight cities.

Details

Journal of Managerial Psychology, vol. 4 no. 4
Type: Research Article
ISSN: 0268-3946

Keywords

Article
Publication date: 14 November 2016

Thomas Salzberger, Marko Sarstedt and Adamantios Diamantopoulos

Abstract

Purpose

This paper aims to comment critically on Rossiter’s “How to use C-OAR-SE to design optimal standard measures” in the current issue of EJM and to provide a broader perspective on Rossiter’s C-OAR-SE framework and on measurement practice in marketing in general.

Design/methodology/approach

The paper is conceptual, based on interpretation of measurement theory.

Findings

The paper shows that, at best, Rossiter’s mathematical dismissal of convergent validity applies to the completely hypothetical (and highly unlikely) situation in which a perfect, error-free measure is available. Further considerations cast serious doubt on the appropriateness of Rossiter’s concrete-object, dual-subattribute-based single-item measures. Being immunised against any piece of empirical evidence, C-OAR-SE cannot be considered a scientific theory and is bound to perpetuate, if not aggravate, the fundamental flaws in current measurement practice. While C-OAR-SE indeed helps generate more content-valid instruments, the procedure offers no insight into whether these instruments work properly for use in research and practice.

Practical implications

This paper concludes that great caution needs to be exercised before adapting measurement instruments based on the C-OAR-SE procedure, and statistical evidence remains essential for validity assessment.

Originality/value

This paper identifies several serious conceptual and operational problems in Rossiter’s C-OAR-SE procedure and discusses how to align measurement in the social sciences to be compatible with the definition of measurement in the physical sciences.

Details

European Journal of Marketing, vol. 50 no. 11
Type: Research Article
ISSN: 0309-0566

Keywords

Expert briefing
Publication date: 30 May 2018

'Psychometrics' usage.