Cyberfactories – How News Agencies Produce News

Gianluca Miscione (School of Business, University College Dublin, Ireland)

Information Technology & People

ISSN: 0959-3845

Article publication date: 9 November 2012

Citation

Miscione, G. (2012), "Cyberfactories – How News Agencies Produce News", Information Technology & People, Vol. 25 No. 4, pp. 438-440. https://doi.org/10.1108/09593841211278811

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited


In Cyberfactories, Barbara Czarniawska focuses on the organizing processes of news production and on how the huge variety of information systems comes together in practice and affects content. The book provides rich empirical data and makes novel use of diverse literature from different disciplines. It builds on Callon's (1998) work, which considers overflow and framing as mutually constitutive: "[e]specially poignant is Callon's observation that it is the very act of framing that produces overflow: otherwise it is merely a flow" (p. 2). The empirical material consists of three ethnographic case studies of three news agencies: TT in Sweden, ANSA in Italy and Reuters in the UK, providing coverage for Europe, the Middle East and Africa. The discussion of these cases is informed by the metaphor of the "Matrix" (from the Wachowski brothers' 1999 movie), which also derives from the science fiction idea of cyberspace (Gibson, 1984).

The research approach preserved what may seem inconsistent or paradoxical in the observations. The circularity of news production, and the specific circuits through which it is articulated, provide the explanatory analysis. Actor-network theory's principle of symmetry (Latour, 2005), i.e. accounting for both humans and artefacts in describing the social, allowed Czarniawska to account for the variety of technologies and systems at work together, and the analysis shows how they can become entangled (or not) in practice. Different cases and diverse situations concur in showing the circularity of news production. Four circuits are identified: mutual influence between news agencies and their clients, training of newcomers by the experienced, sources trying to influence news producers, and news producers being the cause of the events they report. Although these circuits may shake the beliefs of those convinced that news objectively reports a reality out there, readers more inclined towards a social constructionist view of reality may not be surprised. Those familiar with Luhmann's work and systems theory will recognize autopoietic processes behind the self-accountability of the circuits above. What emerges clearly is that between the increasing control exercised by technology and professionals defending the "gut feelings" necessary to do their job, little is left for a conception of news as a neutral mirror of reality.

So, if "space is limitless", as an interviewee says, referring to the lack of constraints on producing all possible news, why does news production work the way it does? Czarniawska's answer is that it seems to be constrained by the circularity of its own circuits. In several passages of the book, technology is described as being in perpetual flux, always in tension between some kind of in-house participatory design (to facilitate quick uptake and reduce gaps with legacy systems) on one side, and continuous pressure to blackbox technology so that it is seamlessly embedded in "revolver-quick" news production on the other. Bowker and Star (1999) defined information infrastructures as classificatory systems. This notion proves central to the present book, as it shows how the classification of news gains importance with increasing scale, in terms of the size and outreach of organizing processes. While at TT the labelling of news does not feature prominently, at ANSA it is used as a set of pigeonholes, and at Reuters it is paramount in keeping a piece of news in the (multiple) appropriate workflows from source to final clients and/or their trading algorithms. At Reuters it is so important to have a copy of a piece of news in all relevant work processes that auto-coding software has been implemented to add labels which may be missing, on the principle of "better safe than sorry", because what is not categorized does not exist: "[t]his is one of the ways in which the world converges (messily, partially) with its representation" (p. 171, quoting Bowker, 2006).
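The "better safe than sorry" auto-coding principle can be sketched in a few lines. This is only an illustrative toy, not Reuters' actual system: the keyword table, labels and function names are all assumptions made for the example. The key design choice it captures is over-labelling: an item missing a label is added to every workflow its text plausibly matches, because an uncategorized item effectively does not exist.

```python
# Illustrative keyword-to-label table (an assumption, not a real taxonomy).
KEYWORD_LABELS = {
    "oil": "ENERGY",
    "barrel": "ENERGY",
    "rates": "MACRO",
    "election": "POLITICS",
}


def auto_code(text, existing_labels):
    """Add any keyword-matched label missing from a news item.

    Over-labels rather than risk dropping the item from a relevant
    workflow: "better safe than sorry".
    """
    labels = set(existing_labels)
    for keyword, label in KEYWORD_LABELS.items():
        if keyword in text.lower():
            labels.add(label)
    return sorted(labels)


# An unlabelled item about oil prices gets routed into the ENERGY workflow.
item_labels = auto_code("Oil price surges past 90 dollars a barrel", [])
```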

In news agencies, where collective cognition plays out, cybernization and cyborgization are seen as emerging phenomena. Cybernization is defined as the "growing control of production by computers, and especially by software" (p. 28), cyborgization as "an even-closer association between people and machines" (p. 198). These organizational phenomena are well illustrated by the example of robo-trading, which attracted attention just after the collapse of Lehman Brothers (Dooling, 2008). A considerable part of global financial trade is nowadays operated by (not just via) computers. This means that computers run algorithms that react (selling or buying) to information, some of it provided by Reuters. Usually, this information consists of market indexes. So, for example, when the oil price rises above a threshold, a computer may buy shares of an energy company without any direct human intervention. This puts news about the Iranian embargo and changes in petrol companies' value only a few milliseconds apart. Still, it seems that trading algorithms react to market numbers, i.e. stock exchange indexes and prices. But we do not have to wait long before seeing financial investors adjusting their trading according to even more elusive indicators derived from the rising field of "sentiment analysis": user-generated data produced on social networks has given rise to the phenomenon of "big data", i.e. large unstructured datasets that organizations collect, analyze and try to use. Bollen et al. (2010) showed that public mood – captured via Twitter – can predict stock markets. Although the consequences of such socio-technical entanglements are being explored by practitioners, such novel organizing forms continue to pose numerous theoretical questions. Indeed, human and non-human market operators see price fluctuations partly under the effects of algorithms, but they cannot discern human intentionality in price setting.
It is a kind of Turing test problem for the collective cognition of price setting. Robo-trading enmeshes the fundamentals of two rather distant disciplines in novel ways: the Turing test from artificial intelligence and the Thomas theorem, a basis of constructivist social sciences. As in a Turing test, one can try to judge what is behind the curtain through abduction (a violent price change could be algorithmic), but what counts at the end of the day is the Thomas theorem: if people define something as real, it is real in its consequences.
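The threshold rule described above – oil crosses a price level, a computer buys energy shares with no human in the loop – can be made concrete with a minimal sketch. Every name here (OIL_THRESHOLD, place_order, on_price_update, the ticker ENERGY_CO) is a hypothetical stand-in for illustration, not any real trading API.

```python
OIL_THRESHOLD = 90.0  # hypothetical trigger price, in dollars per barrel


def place_order(symbol, side, quantity):
    """Stand-in for a broker API call; here it just records the order."""
    return {"symbol": symbol, "side": side, "quantity": quantity}


def on_price_update(oil_price):
    """React to a news-fed price tick: buy energy shares above the threshold."""
    if oil_price > OIL_THRESHOLD:
        return place_order("ENERGY_CO", "buy", 100)
    return None  # below the threshold, the algorithm stays idle


# A tick above the threshold triggers an automatic buy, milliseconds
# after the news that moved the price.
order = on_price_update(95.5)
```

The point of the sketch is how little separates "information" from "action": the entire human judgment of a trader is compressed into one comparison against a constant.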

After a detailed and thought‐provoking journey through news production, Czarniawska discusses what she sees as the major consequences of cybernization and cyborgization: speed, standardization and centrality of software.

News agencies struggle continuously with information overload; however, their coping strategies work well enough to relegate this problem to the background of their concerns. Rather, speed gained the foreground in the case studies. As journalists and all the actors involved have to be "revolver quick" in maintaining the pace of news production while triangulating sources and defending exclusivity, news agencies act in a sort of world wild west (my expression) created by the global outreach of the matrix. Standardization processes unfold in quite unexpected ways. Contrary to Neo-institutionalist expectations, consultants and software solutions do not seem to travel directly from one news agency to another; rather, distinct and unrelated socio-technical arrangements tend to produce increasingly similar products. Indeed, agencies compare themselves with each other on the basis of their outputs, and adjust their operations backwards, revealing a new process of isomorphism.

In conclusion, if traditional research's focus on individual information systems prevents us from seeing the forest for the trees, Cyberfactories looks at the forest of systems and their unfolding in news production. At a time when many disciplines have started to take IT seriously, Walsham (2012) highlights the difficulties facing information systems as a discipline. He stresses the importance of welcoming approaches from different disciplines, widening the field of study to non-traditional settings, giving more credit to critical approaches and rejecting a dominant methodological paradigm. Researchers and practitioners of management, information systems, media and journalism will certainly find this book an appealing response to those calls.

References

Callon, M. (1998), "An essay on framing and overflowing: economic externalities revisited by sociology", in Callon, M. (Ed.), The Laws of the Markets, Blackwell, Oxford, pp. 244-69.

Dooling, R. (2008), “The rise of the machines”, New York Times, 11 October, available at: www.nytimes.com/2008/10/12/opinion/12dooling.html?pagewanted=all

Gibson, W. (1984), Neuromancer, Victor Gollancz, London.

Latour, B. (2005), Reassembling the Social: An Introduction to Actor‐Network‐Theory, Oxford University Press, Oxford.

Walsham, G. (2012), "Are we making a better world with ICTs? Reflections on a future agenda for the IS field", Journal of Information Technology.
