Copyright © 2006, Emerald Group Publishing Limited
Cybernetics and public administration: special issue
Metaphorum brings together people who wish to continue the work of Stafford Beer or who are interested in the organizational and social issues that concerned him in his life and work. The group was first convened by Angela Espinosa, Allenna Leonard, Chris Cullen and Joe Truss in June 2003 at the Scarborough Campus of Hull University, UK, using a group process invented by Stafford Beer called Team Syntegrity. The opening question was “How to progress Stafford Beer’s Legacy/Gift?” The event was sponsored by Hull University Business School, UK. Among the initiatives arising from that meeting was a plan to meet again and share work related to these interests. A non-governmental organisation co-ordinating development of this idea emerged from one of the infosets, and the name Metaphorum was suggested. Ian Perry developed the web site for this event, summarising the results and illustrating the process (see: www.metaphorum.org).
Metaphorum held its first “Colloquium on Cybernetics and Government” at the Cybernetics and Systems Thinking Centre of Sunderland University on 30 April and 1 May 2004, coordinated by Leonie Solomons, Alfredo Moscardini and Angela Espinosa. The second conference was held jointly with the Cybernetics Society Annual Conference at the London School of Economics, UK, on 3 and 4 September 2004. This issue draws together papers from these first two meetings. Alfredo Moscardini, Angela Espinosa, Leonie Solomons and Nick Green coordinated both events.
The general topic of these meetings was governance. The papers include an array of theoretical approaches to governance, as well as applications that analyse structural problems of governance and illustrate these ideas in particular situations such as development programmes, national accounts and political stability. The case studies cover contrasting nations: Colombia, Sri Lanka, Swaziland, Ukraine and the UK.
Some theoretical considerations are offered to understand ideas of societal identity and structure, organisational closure, political structures and media design. Illustrative cases applying Beer’s ideas show a diagnosis of a national economic system, design of a peace process and implementation of socio-economic development programs. Open research paths for the re-design and implementation of multi-national programmes are illustrated through analysis of current AIDS policy and the structure of programmes. Also addressed are the possibilities that more systemic and cybernetic approaches offer for improving government control philosophy, accountability systems, education bureaucracies, and even decision-making on defence technology.
Moscardini, in his letter to the Editors (Appendix), asks how we can progress multi-disciplined research projects from an understanding of nature and social realities as fractal processes in which multiple realities co-exist. We need to shift from guaranteeing the “validity” of models to agreeing on their usefulness in addressing a social concern. The new research approach should offer sound criteria for choosing methodological tools, for critically assessing the usefulness of research results, and for corroborating the value of specific systemic tools in clarifying or improving certain types of real situations. It is clear that we still need more integration of systemic research approaches with more traditional scientific approaches, and he invites us all to engage in an open debate on the issue.
Leonard looks at the structure and practice of democracy in the west according to cybernetic concepts and finds it vulnerable to distortions, lags, gaps and other manifestations of a lack of requisite variety between valid perspectives of circumstances and the ability to metabolise control on behalf of all relevant stakeholders. She addresses the need for improvement in the design and operation of political systems.
Stokes offers a way to use the notions of identity in the viable system model to mesh the understandings of sociology and cybernetics. Since multiple identities have become more central to the way in which societies view themselves, investigation into the way in which they can be understood using organizational cybernetics will be a fruitful avenue of research.
Hoebeke explores the nature of System Five in Beer’s Viable System Model from the perspective of a reflective practitioner. He draws upon his experience in multi-stakeholder platforms to explore notions of organizational closure and the implications it has for improving governance at the board level.
Sergeyev and Moscardini take a complexity management approach to governance. They apply the viable system model to the problems associated with Ukraine’s transition from a centrally controlled communist economy to a market economy and analyse the structural flaws that resulted in adaptations of barter, corruption and overdue debts. They illustrate how structural problems defined the dynamics of change experienced by the economic system.
Dewhurst addresses some of the paradoxes of optimisation from which most non-profit organisations – including public sector ones – suffer these days. He argues that greater awareness of the limitations of the management philosophy that treats targets as the essence of control will limit the damage done by its uncritical acceptance.
Espinosa reports on a socio-economic development project in Colombia that employed Beer’s ideas on organisational design and measurement systems to implement a monitoring system for a presidential programme to fight poverty, combining the resources of universities, government and development agencies to provide more effective interventions and measures more appropriate to the effort. A shift from a top-down to a bottom-up approach that took into account local priorities and concerns was a key element in its success.
Villarreal uses cybernetic insights inspired by the viable system model to propose a more effective approach to the AIDS crisis, improving responses from governments and the development community. There is concern that addressing the many problems created by AIDS in a piecemeal fashion will leave the viability of the affected areas at considerable risk; an approach that blends ameliorating measures in a holistic way would increase the probability of survival of many communities.
Clarke uses concepts from cybernetics and systems to re-conceptualise leadership in the United Nations to respond more effectively to the HIV/AIDS crisis and to change its approach from “doing things right” to “doing the right thing”. “Antidotes” drawn from interviews in Swaziland are proposed as a beginning to help the United Nations move away from an “autistic” inward focus to a more externally focused one.
Solomons and Moscardini apply cybernetic reasoning to the problems associated with the make-up of three different stages of the peace process in Sri Lanka. Using the work of Maturana and Beer, they diagnose deficiencies in requisite variety that will plague future peace talks unless they are addressed. In particular, they discuss the need for the nature and requirements of issues and structures to be addressed at the level of interaction between government and governed before the single unity of Sri Lanka can be expected to succeed.
Thomas celebrates the centenary of W. Ross Ashby, one of the important early pioneers of cybernetics, and his formulation of the law of requisite variety. He applies it to an analysis of UK higher education policy, particularly the focus on academic standards and the decisions to increase student numbers and modularise curricula. The analysis shows why the current regime produces a bureaucratized university and why the current policy of expanding UK higher education may be flawed.
Turke addresses the preconditions for the design of media interactions that would enable governance using Beer’s Viable System Model and Schwaninger’s Model of Systemic Control to balance images of actors in sociological contexts. Proper modelling of interactions in social systems will better support design principles for media to promote the viability and governance of social systems.
Wright gives a sobering view of the way in which technology, in the service of government and private organizations, has developed increasingly sophisticated means to intrude upon our privacy and manipulate situations toward ends that have not been agreed by any legitimate process.
Hansson examines the shortcomings of the current information collected for national accounts with respect to model building and other investigative and pattern recognition tools. New methods, enabled by advances in technology, are suggested to make the data gathered for national accounts more illuminating and more fruitful as a source of pattern recognition and learning.
Many today believe that there needs to be a new paradigm concerning the nature of the world. The closed, deterministic, analytic model of Newton (sometimes known as the scientific method) is being replaced by a view that takes a holistic approach and uses the power of the computer to examine non-linear models that were hitherto insoluble. Nature is also seen as a fractal process. We must consider what this really means in practice when applied to multi-disciplined research projects:
if the world is unknowable, then models are at best only possible explanations of what is really happening;
data are subjective and thus neither strictly nor absolutely true; and
in such cases models cannot predict accurately in any numerical sense.
The process of generating data from a model to compare with existing data is now problematic. If the model is a non-linear one, then the results of any simulation follow an arbitrary path. Thus the results of the model are not guaranteed to mirror what is actually happening (even if we can agree on what is happening). This seems to invalidate the scientific method in such cases. How does this affect our assessment of modern research? What must be avoided is a situation where shoddy work can be passed off as sound.
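The point about simulations following arbitrary paths can be illustrated with a minimal sketch (not from the letter itself): the logistic map, a standard textbook non-linear model, run in its chaotic regime. Two trajectories whose initial conditions differ only in the tenth decimal place soon disagree completely, so no single simulated trajectory can be expected to mirror observed data point for point.

```python
# Sensitivity of a simple non-linear model: the logistic map
# x_{n+1} = r * x_n * (1 - x_n), here with r = 4 (chaotic regime).

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)           # baseline run
b = logistic_trajectory(0.2 + 1e-10)   # same model, tiny perturbation

# First step at which the two runs differ by more than 0.1 —
# i.e. the point where the perturbation has grown to macroscopic size.
divergence_step = next((n for n, (x, y) in enumerate(zip(a, b))
                        if abs(x - y) > 0.1), None)
print(divergence_step)
```

The exact step at which the runs diverge depends on rounding, but the divergence itself is guaranteed by the map's positive Lyapunov exponent; this is why, for such models, "validity" has to mean something other than numerical prediction.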
I am suggesting that a rigorous framework should be established and agreed by like-minded academics. As an initial attempt, I suggest the following:
There should be a detailed defence of the choice of methodology selected. Several should be considered and the final selection should be justified.
The examiners can then examine the use made of this methodology in solving the problem. Does the student apply it correctly? Is the logic of his argument correct? Are there mistakes in building the model or simulation?
The model can be tested in a base case to observe if it seems to be working in a correct way.
The results of the model need to be examined. They can be assessed in the sense that they do or do not advance the current theory. Do they present a coherent and reasonable argument for what is occurring? Do they provide an opportunity for better understanding? The logic of retroduction is allowed as a complement to induction and deduction.
The new results or explanations can be assessed in the light of the old ones and a case for acceptance or rejection can be made.
Such a regime, in my opinion, is the basis for a new assessment paradigm. I am a mathematician by training, and this is therefore a departure for me. Colleagues in the social sciences may have been using similar frameworks for many years. I look forward to correspondence on this subject and hope that agreement can be reached across the whole academic community.
Professor Alfredo Moscardini
University of Sunderland, UK
A. Espinosa and A. Leonard
Guest Editors