Search results
1 – 10 of over 1000
Abstract
A statement from Michell (Michell, J., “Normal science, pathological science, and psychometrics”, Theory and Psychology, Vol. 10 No. 5, 2000, pp. 639‐67), “psychometrics is a pathology of science”, is contrasted with conventional definitions provided by leading texts. The key to understanding why Michell has made such a statement is bound up in the definition of measurement that characterises quantification of variables within the natural sciences. By describing the key features of quantitative measurement, and contrasting these with current psychometric practice, it is argued that Michell is correct in his assertion. Three avenues of investigation would seem to follow from this position, each of which, it is suggested, will gradually replace current psychometric test theory, principles, and properties. The first attempts to construct variables that can be demonstrated empirically to possess a quantitative structure. The second proceeds on the basis of using qualitative (non‐quantitatively structured) variable structures and procedures. The third, applied numerics, is an applied methodology whose sole aim is pragmatic utility; it is similar in some respects to current psychometric procedures except that “test theory” can be discarded in favour of simpler tests of observational reliability and validity. Examples are presented of what future practice may look like in each of these areas. It is to be hoped that psychometrics begins to concern itself more with the logic of its measurement, rather than the ever‐increasing complexity of its numerical and statistical operations.
Carl Norwood, Anna Tickle, Danielle De Boos and Roberta Dewa
Abstract
Purpose
The involvement of service users within clinical psychology training is written into policy. However, the practice of evaluating involvement from both trainees’ and service users’ viewpoint is minimal. The purpose of this paper is to evaluate recent service user involvement in psychometrics and formulation teaching on a clinical psychology training programme, from both service user and trainee perspectives.
Design/methodology/approach
Focus groups were held with service users (n=3) involved in the teaching, as well as trainees (n=3). Additional questionnaire data were captured from trainees (n=11). Service user and trainee data were analysed separately using thematic analysis. Themes generated for trainees were also mapped on to a competency framework for clinical psychologists.
Findings
Both parties found the teaching beneficial. Service users enjoyed supporting trainees and engaged positively in their roles. They identified relational aspects and reflections on their own therapy as other benefits. Trainees reported enhanced clinical preparedness, critical and personal reflection. Trainee anxiety was evident. Learning mapped well to competency frameworks.
Research limitations/implications
The samples were small and some data were truncated. The findings nonetheless speak to broader issues and may transfer to other involvement contexts.
Practical implications
A good degree of meaningful involvement can be achieved through such initiatives, to mutual benefit and enhanced learning.
Originality/value
The nature of the exercise and the dual-aspect approach to evaluation described here help to minimise tokenism. The mapping of findings to competency frameworks supports evaluative processes and helps to legitimise involvement initiatives that challenge the boundaries of existing practice.
Abstract
Purpose
To provide an interview with Mark Batey, R&D Director of E‐Metrixx & Joint Chairman of the Psychometrics at Work Research Group at Manchester Business School.
Design/methodology/approach
This briefing is prepared by an independent interviewer.
Findings
Dr Batey is an international authority on the psychology of creativity. In 2009 he was ranked second in the world for published research into creativity, and in 2010 he appeared with Lord Robert Winston on the BBC's Child of Our Time. He presented at the 2010 HR Conference on the topic of "Addressing the Creativity Crisis", looking at what we should be doing to develop creativity within ourselves and our organizations.
Practical implications
Provides guidance on how to change the culture of an organization to encourage creativity and how creativity can help businesses to survive turbulent markets.
Originality/value
Dr Batey draws on his experiences as a psychologist and Chair of the Psychometrics at Work Research Group to offer businesses a new model for working. His research and training cover creativity, personality, financial behaviour, risk behaviour, attitudes towards fraud, emotional intelligence, and a wide range of other individual differences. Through his interview, Dr Batey highlights how this knowledge can be used to work more effectively.
Sandra I. Cheldelin and Louis A. Foritano
Abstract
Successful businesses today are attending to internal and external changes. Their leaders value diversity and seek a heterogeneous workforce, recognising that effective work teams are essential in the process. Consultants can assist in shaping and managing new organisational cultures. This article describes a particular method of team-building applied at a Fortune 50 client company, drawing on the results of a five-month consultation that involved 20 subgroups in eight cities.
Thomas Salzberger, Marko Sarstedt and Adamantios Diamantopoulos
Abstract
Purpose
This paper aims to critically comment on Rossiter’s “How to use C-OAR-SE to design optimal standard measures” in the current issue of EJM and to provide a broader perspective on Rossiter’s C-OAR-SE framework and on measurement practice in marketing in general.
Design/methodology/approach
The paper is conceptual, based on interpretation of measurement theory.
Findings
The paper shows that, at best, Rossiter’s mathematical dismissal of convergent validity applies to the completely hypothetical (and highly unlikely) situation in which a perfect measure without any error is available. Further considerations cast serious doubt on the appropriateness of Rossiter’s concrete-object, dual-subattribute-based single-item measures. Being immunised against any piece of empirical evidence, C-OAR-SE cannot be considered a scientific theory and is bound to perpetuate, if not aggravate, the fundamental flaws in current measurement practice. While C-OAR-SE does help to generate more content-valid instruments, the procedure offers no insight as to whether these instruments work properly enough to be used in research and practice.
Practical implications
This paper concludes that great caution needs to be exercised before adapting measurement instruments based on the C-OAR-SE procedure, and statistical evidence remains essential for validity assessment.
Originality/value
This paper identifies several serious conceptual and operational problems in Rossiter’s C-OAR-SE procedure and discusses how to align measurement in the social sciences to be compatible with the definition of measurement in the physical sciences.