Information misbehaviour: modelling the motivations for the creation, acceptance and dissemination of misinformation

Thomas D. Wilson (Swedish School of Library and Information Science, University of Borås, Borås, Sweden)
Elena Maceviciute (Swedish School of Library and Information Science, University of Borås, Borås, Sweden) (Faculty of Communication, Vilnius University, Vilnius, Lithuania)

Journal of Documentation

ISSN: 0022-0418

Article publication date: 21 September 2022

Issue publication date: 19 December 2022

Abstract

Purpose

Misinformation is a significant phenomenon in today's world: the purpose of this paper is to explore the motivations behind the creation and use of misinformation.

Design/methodology/approach

A literature review was undertaken, covering English- and Russian-language sources. Content analysis was used to identify the different kinds of motivation relating to the stages of creating and communicating misinformation. The authors applied Schutz's analysis of motivational types.

Findings

The main types of motivation for creating and facilitating misinformation were identified as “in-order-to motivations”, i.e. seeking to bring about some desired state, whereas the motivations for using and, to a significant extent, sharing misinformation were “because” motivations, i.e. rooted in the individual's personal history.

Originality/value

The general model of the motivations underlying misinformation is original as is the application of Schutz's typification of motivations to the different stages in the creation, dissemination and use of misinformation.

Citation

Wilson, T.D. and Maceviciute, E. (2022), "Information misbehaviour: modelling the motivations for the creation, acceptance and dissemination of misinformation", Journal of Documentation, Vol. 78 No. 7, pp. 485-505. https://doi.org/10.1108/JD-05-2022-0116

Publisher

Emerald Publishing Limited

Copyright © 2022, Thomas D. Wilson and Elena Maceviciute

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

For decades, researchers have explored how we interact with information. Various theories have been applied, from activity theory to practice theory, and independent theories have been developed to explain aspects of this interaction: for example, a general theory of information behaviour (Wilson, 2016), everyday life information seeking (Savolainen, 1995), the emotional side of information searching (Kuhlthau, 2004; Nahl and Bilal, 2007), relevance (Saracevic, 2007) and task-based searching (Vakkari, 1999). The information behaviour of many actors in many situations is well documented, but what do we know about how we behave in relation to misinformation? Our aim is to discover what motivates people to create and disseminate misinformation and to propose a theoretical framework for the further exploration of the phenomenon.

According to the Oxford English Dictionary (2022), which defines misinformation as "wrong or misleading information", the word was first recorded in 1605. Disinformation is of more recent origin: the OED defines it as "the dissemination of deliberately false information, esp. when supplied by a government or its agent to a foreign power or to the media, with the intention of influencing the policies or opinions of those who receive it", and its earliest quotation is from 1955.

In a paper on algorithmic methods for the detection of misinformation and disinformation, Søe draws a similar distinction: "More specifically, I define misinformation as unintended misleadingness, inaccuracy, or falsity, whereas disinformation is defined as intentional misleadingness, inaccuracy, or falsity." (Søe, 2018, pp. 321–322). We can also add a definition of "fake news", found in both English- and Russian-language research: news with false or misleading content that imitates the journalistic format of real news, is spread by distributors who intend to deceive or are indifferent to the truth, and is disseminated virally through online media (Jaster and Lanius, 2021, pp. 26–27; Galyashina, 2021, p. 18).

Misinformation has come to prominence mainly as a result of the lies of former US President Donald Trump and his Republican supporters. We also see disinformation elevated to the status of state propaganda on Russian TV, with lies officially demanded of all other media and even private citizens in relation to Russia's war against Ukraine. But the practice is not new; the production of documents to deceive goes back to ancient Mesopotamia, where fake wills were the subject of court cases (Michel, 2020). In Russia, Abraham Firkowich (1787–1874) invented a new history for the Jews and Karaites [a Jewish sect] so convincingly that he secured the protection of the Karaite population from persecution (Shapira, 2020). State-sanctioned propaganda based on disinformation was observed during World War I and World War II, as well as in the era of the Cold War (Bennet, 2020). In fact, it was Adolf Hitler who explained how the "big lie" functions:

in the big lie there is always a certain force of credibility; because the broad masses of a nation are more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. (Hitler, 1939, Ch. X)

One of the problems of distinguishing information from misinformation was identified by the information scientist Christopher Fox, who applied analytic philosophy to the two terms. He established that both information and misinformation "need not be believed by anyone" and that both "need not originate with a reliable informant … but with someone in an appropriate position to know". Furthermore, he found that "information need not be true, though misinformation must be false" (Fox, 1983, p. 12). This makes it quite problematic to distinguish between the two, especially as misinformation may be unintentional, i.e. the originator may believe it to be true.

Other authors in different disciplines have supplied other definitions of these concepts, or different relationships between them (see, e.g. Watzlawick, 1977; Karlova and Fisher, 2013; Vraga and Bode, 2020; or Sunstein, 2021), but we retain the simple definitions presented above.

With these definitions in mind, we use misinformation as the broadest term, covering all kinds of false content, whether intended or not and regardless of the format and channel of dissemination. Disinformation and "fake news" are both narrower: the first is false information spread intentionally; the second is spread in a particular format imitating reliable messages. We focus on the existing research in the light of what it says about the information misbehaviour of the actors involved and, in particular, about the reasons for such misbehaviour. Accordingly, we do not include in the concept of misinformation fantasy, science fiction, satirical or humorous and similar messages that are clearly recognisable as a particular type or genre of information.

We characterise misinformation in the title of this paper as information misbehaviour, intending to typify the associated activities as pathological to some degree: that is, as atypical relative to the norms of communication in society, where the information transferred by broadcast or print media, for example, is assumed to be reliable and true. Even in societies where this norm is generally followed, there may be instances of information misbehaviour, although the unintentional spread of false information does not always fall under the concept, especially when it is not accompanied by sensationalism or a lackadaisical disregard for the facts, as when a newspaper reports the words of a politician without checking them. As far as we are aware, the term has not been used before in information science, although it has appeared in other fields, for example, in Sardou (2021) and Longo (2013). Neither of these authors uses the term as we use it here: Sardou is concerned with the failure of corporations to abide by financial regulations and the resulting market misinformation, and Longo with misbehaviour relating to cyber-security.

2. Modelling the relationships

Jia (2020) reviewed five models of misinformation and disinformation published in social science venues, mainly conference proceedings but also journals. Their authors modelled the actors, the factors supporting the spread of misinformation, the elements of misinformation communication and the elements of misinformation detection (Jia, 2020, p. 88). Wardle (2019, p. 8) has suggested that the motives of the people creating, receiving and spreading misinformation are likely to differ. Two of the models appeared in information science journals: Rubin (2019) used the epidemiological triangle of disease spread, seeing readers as susceptible hosts who lack resistance owing to information overload, lack of time and limited media literacy skills; Karlova and Fisher (2013) created a complex model uniting most of the elements of the other models, identifying, among other elements, "deceivers", "receivers" and "diffusers", together with their respective motivated goals, reasons for believing and intents, which were marked as unknown. These models, together with material collected from Google Alerts, were the basis on which we began thinking about and discussing the problem.

We began by constructing an a priori model of the relationship between the creation of misinformation and its use (Figure 1). The communication sector of the diagram, on the right, suggests that the creation of misinformation is motivated in some way, e.g. by the desire to gain some benefit. However, a misinformation message cannot reach its intended audience without facilitation, whether by an e-mail system or a social media platform such as Facebook or Twitter: without these facilitators, the creator of misinformation could not reach an extended audience. The facilitators' motivation for providing their services is profit, since they survive on the income from advertisers.

The diffusion of misinformation enables use: the user, having some motivation to engage with misinformation, may reject it as false, or may believe it and share it by retweeting it or forwarding an e-mail message to acquaintances. As a result, the misinformation may be refuted by concerned agencies (e.g. the National Health Service in the UK has collaborated with social media to combat misinformation about the Covid-19 vaccines (NHS takes action …, 2020)), or it may persist in cyberspace and continue to be accessed and believed.

Exploring misinformation from an information behaviour perspective draws our attention not only to personal information behaviour, but also to corporate information behaviour, such as YouTube's employment of its recommendation algorithm (Mozilla, 2021) and collective information behaviour, e.g. the actions of religious sects or quasi-political groups such as the Proud Boys (Wilson, 2020), as creators and disseminators of misinformation.

3. The nature of motivation

In Figure 1, we identified motives as the trigger for behaviour: that is, behind the decision to post an anti-vaccination message is some motive, which may not even be fully apparent to the person posting, but, nevertheless, must be present.

We use Schutz's (1962) typification of in-order-to and because motives: in-order-to motives relate to some desired future state, to be brought about by the action of the person, while because motives arise out of the individual's past condition and circumstances.

Schutz notes that in ordinary language, the types are confused and that we refer to in-order-to motives by using the word because. I may say of someone, “He is posting anti-vaccination information because he wants to alert people to potential harmful effects”. In fact, I should say, “He is posting anti-vaccination information in-order-to alert people to potential harmful effects”, as the person is seeking to bring about some desired future. A because motive would be expressed as, “He is posting anti-vaccination information because one of his children almost died following vaccination”. Here the person's motivation is in the past.

Schutz goes on to note that: "Motive may have a subjective and an objective meaning. Subjectively it refers to the actor who lives in his ongoing process of activity. To him, motive means what he has actually in view as bestowing meaning upon his ongoing action and this is always the in-order-to motive." (Schutz, 1962, pp. 70–71 [our emphasis])

The example above of the poster of anti-vaccination messages illustrates this: in the act of posting, the person has an in-order-to motivation; but reflecting upon his action, for example when interviewed, he may reveal the because motive. Schutz notes that the objective meaning of the motive is grasped only in retrospect: "Only in so far as the actor turns to his past and, thus, becomes an observer of his own acts, can he succeed in grasping the genuine because motives of his own acts." (p. 71).

Motives relating to misinformation have been investigated in several disciplines, including communication studies (Herrero-Diz et al., 2020), health information (Apuke and Omar, 2020), psychology (Susmann and Wegener, 2022) and information science (Karlova and Fisher, 2013). With this in mind, we set out to explore the motivations behind information misbehaviour, using Schutz's typification as an analytical tool.
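Since this typification serves as our coding tool in what follows, it may help to set it out schematically. The sketch below (in Python, purely illustrative: the class and field names are ours, not drawn from Schutz or from the literature reviewed) encodes the two motive types, using the anti-vaccination examples given above.

```python
from dataclasses import dataclass
from enum import Enum

class MotiveType(Enum):
    IN_ORDER_TO = "in-order-to"  # oriented towards a desired future state
    BECAUSE = "because"          # rooted in the actor's past circumstances

@dataclass
class Motive:
    actor_role: str        # e.g. "creator", "facilitator", "user", "sharer"
    statement: str         # the motive as expressed or inferred
    motive_type: MotiveType

# The two anti-vaccination examples discussed above, one of each type
examples = [
    Motive("creator", "to alert people to potential harmful effects",
           MotiveType.IN_ORDER_TO),
    Motive("creator", "his child almost died following vaccination",
           MotiveType.BECAUSE),
]

for m in examples:
    print(f"{m.actor_role}: {m.statement} [{m.motive_type.value}]")
```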

A number of research questions can be derived from Figure 1 and three are chosen for exploration in the existing literature:

  1. What motivates the creation of misinformation?

  2. What motivates someone to accept misinformation?

  3. What motivates someone to further disseminate misinformation?

4. Method

4.1 Data collection – Web of Science

When a search is carried out in Web of Science for fake news OR misinformation OR disinformation in the title field, it is evident that the Trump administration in the USA, with its promotion of "alternative facts" (Fandos, 2017), provided a stimulus for the growth of articles on the topic. In 2016, the year of Trump's election, Web of Science recorded only seventy-four articles; from that point the output grows: see Table 1.

Over the period, eighty-seven articles (2.2%) actually mention the ex-President, according to a search of the abstracts. Seventy-four million US citizens voted for him in the 2020 election and many of them continue to believe the lie that he won the election and that it was "stolen" from him (Most Republicans …, 2021). It would be interesting to investigate how many of these supporters are still active in spreading misinformation.

The 3,998 records for the period 2017–2021 were searched for occurrences of the terms motive/s/ation or intention/s in the abstracts, since examining every full text for the same terms would have been much too time-consuming. This resulted in 372 articles, and analysis of the abstracts identified 109 of apparent further interest. These 109 articles were distributed over the Web of Science research areas as shown in Table 2. Only those research areas with ten or more articles are shown and these account for 92% of the total.
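As a minimal sketch of this screening step, assuming the Web of Science records had been exported to a CSV file with an "abstract" column (the file name and column layout here are hypothetical), the term filter could be applied as follows:

```python
import csv
import re

# Variants searched for in the abstracts: motive/s, motivation/s, intention/s
PATTERN = re.compile(r"\bmotiv(?:e|es|ation|ations)\b|\bintentions?\b",
                     re.IGNORECASE)

def screen_abstracts(path):
    """Return the records whose abstract contains a motive/intention term."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if PATTERN.search(row.get("abstract", ""))]

hits = screen_abstracts("wos_export_2017_2021.csv")  # hypothetical export file
print(len(hits), "candidate records")  # 372 records at this stage of our study
```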

4.2 Data collection – Russian research literature

Given the attempts from Russia to affect the results of elections in the USA, France and the UK (Daniels, 2017; Narayanan et al., 2017) and its massive disinformation propaganda related to the war in Ukraine, aimed at its own population and the international public, we thought it useful to search the Russian research literature to see how the subject has been treated by Russian scholars. Google Scholar was searched using a variety of queries (fal'shivyje novosti, feikovyje novosti, poddel'nyje novosti, misinformacija, dezinformacija and their combinations). As the basis for searching differs in Google Scholar, it returns rather more results; however, the publication years on the first ten pages of articles in each search range from 2011 to 2021, with the highest number of hits in 2017–2021. To some extent this is confirmed by the statistics of the largest open science database in Russia, Cyberleninka (cyberleninka.ru), which, for the query "fal'shyvyje novosti" (on January 1, 2022), shows the numbers given in Table 3.

It seems that the interest in Russian-language research was triggered by the accusations of Russian meddling in the Brexit process, although Trump's "feikovyje novosti", related to his Russian contacts, also made an impact (Zuikina and Sokolova, 2019). The research areas in Cyberleninka are preset and articles are assigned to them; there is no way to identify how the classification areas overlap. The texts are, however, distributed over the research areas shown in Table 4.

There is an obvious overlap between the research areas, but it is interesting that the largest number of articles is assigned to linguistics. This can be attributed to studies of the emergence and development of the linguistic discourse of information warfare in the political sphere, but also in economics and ideology (Kushneruk and Chudinov, 2019). These studies were omitted, as were reviews that are opinion pieces without proper references to literature or cases. In terms of the topics discussed, the query misinformacija mainly returns historical studies of the phenomenon, while the remaining query words return publications exploring mainly modern cases related to the presidential elections in the USA, Brexit, the Covid-19 pandemic and the Russian-Ukrainian conflict (before the start of the war).

Further information was collected from Google and Google Scholar Alerts created to monitor current journalism and scholarly contributions. The Google Alert generated notifications at least once a day, signifying the prevalence of the topic in the daily news of many countries.

4.3 Analysis

The bibliographic data and abstracts of all 109 papers retrieved from Web of Science that included variants of motive were downloaded to an EndNote library. Papers on technological solutions to the problem of misinformation, popular journalism, bibliometric studies of the growth of the literature in the field and a variety of other topics deemed irrelevant to the present exercise were removed. The remaining papers were reviewed and, where specific motives were identified and explored in the research, the concepts were recorded and standardised. Many papers used the term motive or motivation only in a general context; for example, the statement "our inability or lack of motivation to recognise and challenge misinformation" (Yeo and McKasy, 2021) occurs in a paper that does not discuss motivations to create, use or disseminate misinformation. A number of the papers also use the term in relation to the specific theory of motivated reasoning.

More useful were those papers that identified specific motives. For example, in the statement, "individuals who report hating their political opponents are the most likely to share political fake news and selectively share content that is useful for derogating these opponents" (Osmundsen et al., 2021), the phrase "useful for derogating these opponents" is an aspect of the concept of achieving or retaining political power.

Papers from the Russian research literature were treated in a similar way. Cyberleninka allows access to the first 100 full-text documents registered in the system for a given query. Most of the hits for the query fal'shivyje novosti, feikovyje novosti related to the linguistic characteristics of fakes, to ways of counteracting them by legal, technological or other means, or to historical studies, or were reviews of previous literature, and so had little or no bearing on the issue of motives for creating, accepting or spreading fake news. Only 22 papers were relevant to our research questions on the motivation to create or use misinformation, and these were chosen for analysis.

The identified motives were then sorted under related concepts, such as political power, economic gain, or the need for belonging (as noted above). The number of articles related to a particular concept had no bearing on the creation of that concept: even a single motive could constitute a concept if nothing similar was found. This is consistent with the qualitative approach adopted in our literature study. Each concept was then interpreted by each author of this article in terms of Schutz's types of motives (because and in-order-to). The two authors' interpretations were identical, which may be explained by their thorough discussion of Schutz's typifications before the study.
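The resulting coding frame can be summarised schematically; the sketch below is illustrative only, the concept names being a sample taken from the findings reported in the next section.

```python
# Sample coding table: motive concepts mapped to Schutz's two types.
# The concepts follow the findings below; this is not the full coding frame.
CODING = {
    "gain or retain political power": "in-order-to",
    "economic gain": "in-order-to",
    "personal prestige": "in-order-to",
    "true belief": "because",
    "need for social belonging": "because",
    "distrust of authority": "because",
}

def schutz_type(concept):
    """Return the Schutz motive type assigned to a coded concept."""
    return CODING[concept]  # a KeyError signals an uncoded concept

assert schutz_type("economic gain") == "in-order-to"
```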

5. Results

5.1 What motivates the creation of misinformation?

The most obvious set of motivations for creating misinformation relates to its use by politicians to gain or retain power. The events surrounding the 2020 Presidential election in the USA brought the subject sharply into focus. President Trump's promotion of misinformation during his presidency set the tone for the election and subsequent events. The Washington Post database of Trump's misleading and false claims (or simply lies) (https://www.washingtonpost.com/politics/interactive/2021/timeline-trump-claims-as-president/?itid=lk_inline_manual_10) ultimately totalled 30,573 items, the numbers increasing with each year of the presidency. It is rather ironic that, while promoting misinformation, he labelled authentic news sources "fake news". Thus, it seems that Trump sought to deflect attention from his own actions, to denigrate the work of opposition politicians, to pander to his support base and to gain support for re-election: all falling into the category of gaining and sustaining political power.

Where a nation's leader goes, it is likely that others will follow: Slevin (2021) reported on the 2020 election results in Iowa where the Republicans:

drove turnout to unexpected levels by crafting blunt-force narrative anchored in puffery and lies when it came to Trump caricature and when it came to Democrats … and it worked, especially in rural counties, where Trump and the G.O.P. won by significant margins.

Just as the party machine may operate in this way, individual politicians are also misinformation spreaders. Most notably, Republican Marjorie Taylor Greene,

was one of 147 Republicans who voted in favor of objections to the results of the presidential election in early January and has falsely claimed there was widespread voter fraud. She has praised QAnon, a baseless conspiracy theory that claims Democrats and Hollywood celebrities are Satan-worshipping, cannibalistic pedophiles. (Funke, 2021)

No doubt the motivation for her political untruths is the desire to appeal to what now appears to be the core Republican voter. The motivation for propagating other nonsensical ideas is related to the same political purpose, since they are clearly designed to denigrate the opposition.

Domestic politicians are not the only sources of misinformation in relation to the US elections: in the US and elsewhere Russia has been active in spreading misinformation online. Russia's use of cyberspace to seek to influence political developments in the West is nothing new; in 2017 the then UK Prime Minister Theresa May accused Russia of “meddling in elections and planting fake stories in the media in an attack on its attempts to ‘weaponise information’ in order to sow discord in the west” (Mason, 2017).

Russia also attempted to influence the Brexit referendum in the UK (Narayanan et al., 2017), as well as the 2016 and 2020 US Presidential elections (Berghel, 2017) and the 2017 French election (Daniels, 2017). In the case of France, the attempt was frustrated by the structure of the election process and by the role of an independent body in countering the attacks (Conley and Vilmer, 2018).

Thus, misinformation or, rather, disinformation is a tool of information warfare in the international arena. Totalitarian regimes, such as those of Nazi Germany and the Soviet Union, were especially masterful in such disinformation. Similar political powers have understood and used modern network technology to undermine the political systems of competitors and to redirect the attention of the worldwide public and of opponents from their own aggressive or unlawful actions (Berghel, 2017; Daniels, 2017).

A side-effect of such disinformation is the strengthening of political positions inside the countries concerned. This differs from the previous examples, as the political actors in this case are not publicly visible: complex and invisible state mechanisms employ professional and voluntary workers who do the job for personal gain or to avoid sanctions (Vasu et al., 2018, pp. 10–13). We can also see that, in Russia, research on misinformation in politics either focuses on the historical origins of the phenomenon (Arkhangelskaja and Arkhangelskaja, 2020), relates the resurrection of cold-war tactics to the atmosphere of deteriorating international relations (Romanov and Shabaev, 2020), or explores cases outside Russia (Koshkin, 2020). In some cases, such research repeats Russian disinformation accusing foreign actors of discrediting Russia by falsehood (Il'ichova and Kondrashov, 2018): for example, the poisoning of Skripal' and his daughter is presented as "political manipulation aiming to dehumanise Russia" (Brusenskaja and Kulikova, 2019, p. 105). As the researchers present no empirical evidence in these cases, they themselves participate in spreading falsehoods.

In the political sphere, as shown here, the motivation for the creation of misinformation is political: an individual, a party, or an interested organisation will create misinformation to divert attention from its own behaviour, to denigrate the opposition parties and to persuade the electorate of the relevance of its position and of the supposed ill-effects of the opposition's manifesto, all of which can be summarised as the aim to gain or retain political power.

The so-called "Christian Right" is also closely associated with right-wing political fake news and misinformation in the USA; indeed, Douglas (2018) attributes the origin of fake news to the religious fundamentalists who reject Darwin's theory of evolution, as well as "the historical-critical method of Bible scholarship" (p. 6). Their motivation is the preservation of what they believe to be the traditional values of American society and the defence of the Old Testament account of the origins of humankind. Members have also been promoters of conspiracy theories, most notably those of QAnon. Graves and Fraser-Rahim note:

a significant number of evangelical Christians have, through social media, descended into QAnon’s conspiratorial depths. This phenomenon appears to be, thus far, unique to white evangelical Christians. QAnon myths are infused with anti-Muslim and anti-Semitic rhetoric. Theories abound with claims that mask ordinances are a part of a long conspiracy by Muslims to install sharia law within the United States and that underground cabals of child sex slaves are funded by Jewish investor George Soros. (Graves and Fraser-Rahim, 2021).

In the Russian research, we find serious empirical work by Zuikina and Sokolova (2019), who identified a number of fake stories about Ukraine spread in the Russian media after 2014 (e.g. the crucifixion of a boy in Slaviansk; poverty and payment for public transport in salt; the exclusion of Crimea from Ukraine by the Ukrainian authorities), in which misinformation is used, regardless of its origin or truth, to justify the political aims and actions of the communicator and to meet the emotional needs of the internal, Russian, audience (Zuikina and Sokolova, 2019, p. 13).

The economic motivation is also brought to bear in the political sphere: perhaps the most notable example is the misinformation "factory" in the small town of Veles in Macedonia, where Mirko Ceselkoski runs the Facebook Marketing University, training mainly young men to make money by attracting visitors to their Facebook pages or websites (Hughes and Waismel-Manor, 2020). Ceselkoski claims to have helped Donald Trump win the 2016 US election, although the effect of the fake news generated by his trainees is uncertain. Given that the monthly net wage in the town is about $360, the economic imperative is clear: some residents are said to have made tens of thousands of dollars, and in some cases more than a million.

The economic motivation is also present in research on misinformation related to competition, such as false advertising, as in the case of the face-cream promotion by J. Berzvershenko (Sarkisiants and Riabova, 2019). The desire of the media to attract attention for commercial gain while saving resources, without investing in serious journalistic work and fact-checking, is also often mentioned in Russian research (Dorofejeva, 2019; Sukhodolov and Bychkova, 2017).

Another major area of misinformation is medicine and health care. This has come to the fore recently as a result of the Covid-19 pandemic, leading, for example, to President Biden's claim that the misinformation available on Facebook was "killing people" (Rodriguez, 2021). Biden subsequently modified his statement, commenting that "Facebook isn't killing people, these 12 people are out there giving misinformation" (Kelly, 2021). The twelve people referred to are the "disinformation dozen" (Center for Countering Digital Hate, 2021). The Center's research found that "Content attributed to members of the Disinformation Dozen had been posted or shared 503,896 times, representing 73.1% of the total anti-vaccine posts represented by our sample".

Although twelve people are named in the report, they include three couples: Joseph Mercola (number one) is the husband of Erin Elizabeth (number seven), Ty and Charlene Bollinger are number three and Kelly Brogan (number nine) is the wife of Sayer Ji (number eight). In almost all cases these people sell alternative health products, and their motivation is clearly commercial gain. Fedorov et al. (2020) identified a similar motivation in Russian-language media: promoting certain products against infection, or attracting attention and increasing social capital by offering easy solutions for recovery.

In the case of Robert F. Kennedy, Jr., however, there is no such motivation: Kennedy promotes his anti-vaccination ideas and associated conspiracy theories through his organisation, Children's Health Defense (https://childrenshealthdefense.org), and he seems genuinely to believe that vaccines are harmful to children and adults.

Rizza Islam posts misinformation about the Covid-19 vaccine, but is also antisemitic and homophobic. He promotes conspiracy theories, claiming, for example, that Bill Gates was involved in planning the pandemic and that the vaccine makes women infertile. He has more than 500,000 social media followers, most of them in the African-American community (Center for …, 2021). The motivation here seems to be that of gaining personal prestige in his community, particularly in the organisation Nation of Islam (https://www.noi.org), of which he is a prominent member (Rizza Islam, 2021).

The situation can be summed up in the words of Pennycook and Rand (2020): “Bullshit, such as in the case of fake news, is constructed to garner attention (or advertising revenue) or achieve some sort of social or political gain (regardless of its truthfulness).” (p. 191)

In theoretical terms, the motivations for the production of some form of misinformation are what Schutz (1962, p. 69) calls in-order-to motives; that is, they express the desire to achieve some end. As Schutz notes, "From the point of view of the actor this class of motives refers to his future": the creator of misinformation is seeking to bring about an increase in sales, an increase in personal prestige, or some other desired future state.

At this point we can expand the model to show the motivations to create misinformation (Figure 2).

5.2 What motivates someone to accept misinformation?

The literature selected from Web of Science and Cyberleninka revealed little devoted to the motivation to accept misinformation, although some factors do emerge. For example, Agley and Xiao (2021) studied people's belief in narratives about Covid-19. Four "profiles" emerged from the analysis: the largest group of respondents (70.15%) believed the scientific evidence and scored the alternative, conspiracy-theory narratives low. The remaining three profiles were more complex; profile 2 respondents, for example, scored all five statements highly, apparently being prepared to believe the conspiracy theories while also believing the science.

Profile 1 respondents differ from those in profile 2 in having a more liberal political orientation and much lower religious commitment (a mean score of 3.84 versus 7.56), suggesting that conservative political views and religious belief may motivate the acceptance of misinformation. Religious fundamentalism was also identified by Bronstein et al. (2019) as affecting the acceptance of misinformation, along with delusional ideation, dogmatism and reduced analytical thinking. Badrinathan and Chauchard (2021), in a study carried out in India, also identify religiosity as a factor in belief in conspiracy theories.

Political ideology emerges as one of the most significant factors affecting the acceptance of misinformation; for example in work by Reedy et al. (2014), Uscinski et al. (2016) and Winter et al. (2016). In a study of the German elections of 2017, Zimmermann and Kohring (2020) (who use the term disnews for disinforming news) noted that,

former CDU/CSU supporters were more likely to refrain from electing this party the more they believed disinformation. Instead, these voters tended to choose either the AfD or the SPD … However, the impact of disnews on voting for the AfD becomes much stronger and more significant for CDU/CSU supporters with right-wing attitudes. (p. 230)

Jusha's (2019) experimental study confirmed that 40% of its participants did not verify a false message when they had seen it many times, as it was easier to take it for granted than to check it: repeated messages made people believe them without checking and without regard to contradictory knowledge and experience (p. 186). A lack of critical attitude in the acceptance of misinformation was mentioned by Dorofejeva (2019), and media illiteracy by Sukhodolov and Bychkova (2017) and Zuikina and Sokolova (2019).

Reduced analytical thinking has also been identified by Pennycook and Rand (2019, 2020) as related to the acceptance of misinformation in the form of fake news: using the cognitive reflection test (Frederick, 2005) they note that:

individuals who are more willing to think analytically when given a set of reasoning problems (i.e. two versions of the Cognitive Reflection Test) are less likely to erroneously think that fake news is accurate. (Pennycook and Rand, 2019, pp. 46–47)

Pennycook and Rand (2021, p. 399) note that, "Rather than being bamboozled by partisanship, people often fail to discern truth from fiction because they fail to stop and reflect about the accuracy of what they see on social media", suggesting that political partisanship has a weaker effect on the acceptance of misinformation and that acceptance depends more on previously established beliefs. Bago et al. (2020) carried out an experiment to determine whether belief in fake news was affected by rational deliberation rather than by motivated reasoning (Kunda, 1990), in which a person's reasoning is shaped by the need to protect their self-image or political ideology. A similar phenomenon is noted by Fedorov et al. (2020), who confirm that people seek news confirming their established views and opinions, or complying with the perceived consensus, without regard to the origin and verisimilitude of the information.

On the other hand, Baptista et al. (2021), in a comparison of right-wing and left-wing supporters in Portugal, found "a greater tendency for right-wing party participants to accept fake news and news that confirms their beliefs" and concluded: "we can confirm that partisanship, like political ideology, shows that people who belong to right-wing parties are more vulnerable to believing in fake news" (pp. 36–37).

Increased emotionality has also been identified as influencing the acceptance of misinformation (Martel et al., 2020): "momentary emotion, regardless of the specific type or valence of emotion, is predictive of increased belief in fake news and decreased discernment between real and fake" (p. 15) and "the more [they] relied on emotion over reason, the more they perceived fake stories as accurate" (p. 15). Russian researchers refer to anxiety and emotional instability caused by uncertainty about a situation (Manoylo and Popadiuk, 2020), or even to mass psychosis caused by a lack of trust in official media and the sensationalism of fakes spread through information channels (Archakova, 2020).

We can now elaborate the initial model with respect to the motivations that lead to the acceptance of misinformation.

It will be clear that the motives here are rather different from those that underlie the creation of misinformation: they are what Schutz (1962) describes as because motives. Such motives arise out of the actors' past experiences: their education, their family background, their social background and their biographically determined situation. Figure 3 presents these motivations.

5.3 What motivates someone to further disseminate misinformation?

In 2018, the journal Science published the results of an investigation into the diffusion of false news on Twitter between 2006 and 2017 (Vosoughi et al., 2018). The authors show that "the top 1% of false news cascades diffused to between 1,000 and 100,000 people, whereas the truth rarely diffused to more than 1,000 people" (p. 1146). The influence of false political news ran deeper than that of any other kind (e.g. natural disasters, financial news or urban legends). The authors controlled for the activity of robots and found that robots had the same effect on the spread of both false and true stories, confirming that false news is spread further and more efficiently by human actions. In explaining their results, the authors speculate that false news was experienced as more novel and provoked stronger emotional responses than true stories, which may be why it is disseminated further.

Research shows that modern information technologies, especially social media, make the dissemination of news so easy that it often happens before the meaning of the message is grasped, carrying with it an "implicit endorsement that comes with sharing" (Lazer et al., 2018, p. 1095), "just because they can" (Zuikina and Sokolova, 2019, p. 19). The failure to reason about and evaluate content before sharing was also found in the study of Covid-19 misinformation by Pennycook et al. (2020). The motives for accepting and believing misinformation identified in the previous section, such as emotional response (Martel et al., 2020) and political or religious partisanship (Douglas, 2018; Reedy et al., 2014), also motivate people to share and disseminate misinformation.

Although there are few studies of the actual motives for sharing misinformation, some have identified its predictors and reasons. A systematic review of the literature on health-related misinformation (Wang et al., 2019) found that the basic law of rumour can be applied to its spread online: rumour circulation depends on the importance of the subject to the individuals concerned multiplied by the ambiguity of the evidence, for example, the number of conflicting messages of equal credibility on the topic (Allport and Postman, 1947). Thus, the propagators of anti-vaccination messages are characterised by vaccine aversion, by the increasing importance of communication, by knowledge deficiency (Krishna, 2016) and by mistrust of government and pharmaceutical companies (Xu and Guo, 2018), which undermines the credibility of the messages sent by those bodies.
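Allport and Postman's basic law of rumour is usually written as a simple product, with R the amount of rumour in circulation, i the importance of the subject to the individuals concerned and a the ambiguity of the evidence:

$$R \sim i \times a$$

The multiplicative form is the point: if either the importance of the subject or the ambiguity of the evidence falls to zero, no rumour circulates, which is consistent with the predictors of anti-vaccination messaging listed above.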

Sharing health-related or any other misinformation online is influenced by individual and collective psychological factors, such as epistemic beliefs, feelings of fear or hope (Chua and Banerjee, 2017) and social homogeneity. However, the problem is more than a simple inability to assess the accuracy of the news: having found that study participants claimed they would share messages despite having correctly assessed their accuracy, Pennycook et al. (2021) state:

(i) people do care about accuracy more than about other content dimensions, but accuracy nonetheless often has little impact on sharing because (ii) the social media context focuses their attention on other factors such as desire to attract followers/friends, or to signal one's group membership. (p. 3).

The social aspect was also identified in a recent study of the motives for sharing Covid-19 misinformation on Nigerian social media (Apuke and Omar, 2021a, b). The authors found that altruistic people who wish to help others are more likely to share Covid-19 misinformation on social media and that a general inclination to seek and share information is a strong predictor of sharing misinformation. Thus, people gain gratification through helping others, seeking social acceptance and passing time in finding and exchanging information. The same wish to help or warn others by sharing misinformation was found in a survey of social media users in Russia, where it also correlated with the importance of the issue and the perceived credibility of the sender (Romanov and Shabaev, 2020, p. 17). On the other hand, self-promotion also plays a role (Apuke and Omar, 2021a).

Considering fake news relating to Covid-19, Balakrishnan et al. (2021) tested six variables relating to motives for sharing fake news and found that altruism, ignorance and entertainment (i.e. "the use of technologies simply for the fun of it or for escapism" (p. 4)) were positively and significantly associated with a willingness to share fake news. Akhmadeev and Bresler (2021, p. 93) found that satirical news items from the website Panorama were not only shared on Facebook as real news, but were also enhanced and embellished for fun.

Thus, to the earlier motives we can add that the further dissemination of misinformation relates to the perceived importance of the subject to the person, reduced analytical thinking, the perceived ambiguity of the evidence, distrust of the originators of scientific information, an altruistic wish to help others, the need for social belonging, self-promotion, a general inclination to seek and share information, and entertainment. These factors are set out in Figures 4 and 5.

The majority of these motivational factors are because motives: misinformation is disseminated because of the person's pre-existing distrust of authority, or because of their reduced capacity for analytical thinking, or because of a pre-existing desire to serve others, and so on for the other factors. The motivation of the religious fundamentalist may be rather more complex: they may spread misinformation in-order-to bring about more general acceptance of their fundamentalism, but, as Schutz (1962, p. 71) notes, there may be a because motive behind the in-order-to motive, and in this case a person may spread misinformation because of their pre-existing beliefs.

6. Discussion and conclusions

As a result of the findings, we can elaborate the initial a priori model:

This model is built on the assumed sequence of the creation, acceptance and dissemination of misinformation presented in Figure 1. Its main part relates to the motives disclosed for the people taking part in each process in that sequence. Thus, Wardle's (2019) suggestion that the motives in each phase should differ is confirmed, and we have identified and named the motives marked as unknown in the model by Karlova and Fisher (2013).

The most immediately obvious feature of the model is that the motivations for accepting and sharing misinformation are more numerous and more complex than those that underlie its creation and facilitation. We do not suggest that the model includes all possible motivations: an analysis of a larger body of documents would undoubtedly reveal more. However, the types of motivation suggested by Schutz are established with sufficient variation. The dataset, comprising two rather different parts, has helped to show that very similar motives for the creation, acceptance and dissemination of misinformation are found by researchers working in two different traditions of social science.

Regardless of the completeness of the model, however, it is useful to note the different types of motivation in terms of Schutz's typification introduced earlier in this paper. It is evident that the motivations for the creation of misinformation are predominantly in-order-to motives: the desires for political power, personal prestige, or corporate gain are all motives of this type, since they relate to an anticipated future. The only because motive we found is that of true belief, since it refers to the pre-existing state of mind of the creator.

The motivations for the facilitation of misinformation, by making platforms such as Facebook and Twitter available, are also clearly in-order-to motives: mainly to gain commercial or personal profit, or political power. The owners of the facilitating platforms may promote noble ideals, such as Facebook's mission "to give people the power to build community and bring the world closer together" (Facebook, 2019), but, as a recent whistleblower suggests (Timberg, 2021), the company is really more interested in profit. Other facilitators may contribute to the creation or spread of misinformation through a lack of competence, through saving resources by not allocating enough time to the preparation of materials, or for the sake of gaining prestige.

While some in-order-to motives may exist for the acceptance of misinformation, those found in the literature are predominantly because motives. This is understandable since, at the point of acceptance, the viewer has no further motive in mind than to decide upon the truth value of the information for themselves. Consequently, their previous understanding of the topic, their religious or political bias and their stock of knowledge (Schutz, 1962, pp. 20–21) on the subject are the factors that determine acceptance or rejection.

As noted earlier, the motives for the further dissemination and sharing of misinformation appear to be more numerous and more complex than in the other cases.

The model prompts a number of research questions that could be explored further, for example:

  1. In relation to the willingness to accept misinformation as true, some research exists on the credibility of senders (Karlsen and Aalberg, in press) and on the validity of content (van der Linden et al., 2017). More research linking creators, facilitators and end users may bring a better understanding of decisions to accept or reject misinformation.

  2. Which particular motives play the crucial role in sharing misinformation? Which motives are involved in which situations or areas of misinformation, e.g. health vs. politics, climate change vs. cybersecurity and so on?

  3. The interaction of because and in-order-to motives: if we accept Schutz's comment that, in the act, the actor's motive is always in-order-to, researchers should always explore the underlying because motives. Accepting a user's description of their motives while at the keyboard and screen would result in an inadequate understanding of the true motivations for their actions.

In conclusion, we believe that the use of Schutz's typification of motives provides a useful mode of analysis for understanding the motivations of those who create, facilitate, accept and share misinformation. The resulting conceptual model of Figure 5 presents a framework for the further exploration of the phenomenon and suggests avenues of investigation that could throw further light on the information misbehaviour of the actors.

Figures

Figure 1. A priori model of the creation and use of misinformation

Figure 2. Categories of motives to create misinformation

Figure 3. Categories of motives to accept misinformation

Figure 4. Categories of motives to disseminate misinformation

Figure 5. Elaborated model including all categories of motives

Table 1. Growth of publications on misinformation

Year                  2017  2018  2019   2020   2021  Total
No. of publications    232   420   779  1,149  1,418  3,998

Table 2. Distribution of papers over Web of Science research areas

Research area         No. of papers
Communication                    35
Computer science                 24
Government & law                 14
Science/technology               13
Information science              12
Psychology                       11
Total                           109

Table 3. Publications in Russian

Year                  2017  2019  2020–2021
No. of publications    621   446        453

Table 4. Publications in Russian over Cyberleninka research areas

Research area                            No. of articles
Linguistics and literature research                  344
Media and mass communication                         290
History and archaeology                              227
Philosophy, ethics, religious studies                114
Political science                                    110
Law                                                   94
Art studies                                           84
Business and economics                                70
Sociology                                             21
Educology                                             18

References

Agley, J. and Xiao, Y. (2021), “Misinformation about COVID-19: evidence for differential latent profiles and a strong association with trust in science”, BMC Public Health, Vol. 21 No. 1, 89.

Akhmadeev, K.N. and Bresler, M.G. (2021), “Fenomen ‘fake news’ informacionnogo/cifrovogo obshchestva v diskurse setevogo bytija (Fake news phenomenon of information/digital society in the discourse of networked life)”, Znak: Problemnoje Pole Mediaobrazovanija, Vol. 3 No. 41, pp. 91-102.

Allport, G.W. and Postman, L. (1947), The Psychology of Rumor, Henry Holt, New York, NY.

Apuke, O.D. and Omar, B. (2020), “Modelling the antecedent factors that affect online fake news sharing on COVID-19: the moderating role of fake news knowledge”, Health Education Research, Vol. 35 No. 5, pp. 490-503, doi: 10.1093/her/cyaa030.

Apuke, O.D. and Omar, B. (2021a), “User motivation in fake news sharing during the COVID-19 pandemic: an application of the uses and gratification theory”, Online Information Review, Vol. 45 No. 1, pp. 220-239.

Apuke, O.D. and Omar, B. (2021b), “Fake news and COVID-19: modelling the predictors of fake news sharing among social media users”, Telematics and Informatics, Vol. 54, 101475.

Archakova, M.A. (2020), "'Fejki' i memy covid-19: neumestnaja terapija ili celitel'nyj smechooptimism (Covid-19 'fakes' and memes: inappropriate therapy or healing laughter optimism)", Kommunikologija: elektronnij nauchnij zhurnal, Vol. 5 No. 4, pp. 61-68.

Arkhangelskaja, I.B. and Arkhangelskaja, A.S. (2020), "Feik-njus v docifrovuju i cifrovuju epokhu (Fake news in the pre-digital and digital eras)", Znak: problemnoje pole mediaobrazovanija, Vol. 3, pp. 95-104.

Badrinathan, S. and Chauchard, S. (2021), “Conspiracy theories and miracle cures: fighting COVID-19 misinformation in India”, Paper presented at the Southern Political Science Association 2021 Annual Meeting Online, January 6-9, 2021.

Bago, B., Rand, D.G. and Pennycook, G. (2020), “Fake news, fast and slow: deliberation reduces belief in false (but not true) news headlines”, Journal of Experimental Psychology: General, Vol. 129 No. 8, pp. 1608-1613.

Balakrishnan, V., Ng, K.S. and Rahim, H.A. (2021), “To share or not to share – the underlying motives of sharing fake news amidst the Covid-19 pandemic in Malaysia”, Technology in Society, Vol. 66, 101676.

Baptista, J.P., Correia, E.R., Alves, A.G. and Piñeiro-Naval, V. (2021), “Partisanship: the true ally of fake news? A comparative analysis of the effect on belief and spread”, Revista Latina de Comunicación Social, No. 79, pp. 23-46.

Bennet, G. (2020), “Propaganda and disinformation: how a historical perspective aids critical response development”, in Paul, B., O’Shaughnessy, N. and Snow, N. (Eds), The SAGE Handbook of Propaganda, SAGE Reference, London, pp. 245-260.

Berghel, H. (2017), “Oh, what a tangled web: Russian hacking, fake news and the 2016 US Presidential election”, Computer, Vol. 50 No. 9, pp. 87-91.

Bronstein, M.V., Pennycook, G., Bear, A., Rand, D.G. and Cannon, T.D. (2019), “Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism and reduced analytic thinking”, Journal of Applied Research in Memory and Cognition, Vol. 8 No. 1, pp. 108-117.

Brusenskaja, L.A. and Kulikova, E.G. (2019), “Fake kak element manipulirovanija obshchestvennym soznaniem (Fake news as an element of manipulation with social consciousness)”, Humanitarnyje i Social’nyje Nauki, No. 5, pp. 101-112.

Center for Countering Digital Hate (2021), The Disinformation Dozen. Why Platforms Must Act on Twelve Online Anti-vaxxers, Center for Countering Digital Hate, London, available at: https://252f2edd-1c8b-49f5-9bb2-cb57bb47e4ba.filesusr.com/ugd/f4d9b9_b7cedc0553604720b7137f8663366ee5.pdf (accessed 19 March 2022).

Chua, A.Y.K. and Banerjee, S. (2017), “To share or not to share: the role of epistemic belief in online health rumors”, International Journal of Medical Informatics, Vol. 108, pp. 36-41.

Conley, H.A. and Vilmer, J.-B.J. (2018), “Successfully countering Russian electoral interference”, Center for Strategic & International Studies, available at: https://www.csis.org/analysis/successfully-countering-russian-electoral-interference (accessed 19 March 2022).

Daniels, L. (2017), "How Russia hacked the French election", Politico, 23 April, available at: https://www.politico.eu/article/france-election-2017-russia-hacked-cyberattacks/ (accessed 19 March 2022).

Dorofejeva, V.V. (2019), "Fejkovyje novosti v sovremennom media prostranstve (Fake news in the modern media space)", Theoretical and Practical Issues of Journalism, Vol. 8 No. 4, pp. 774-786.

Douglas, C. (2018), Religion and Fake News: Faith-Based Alternative Information Ecosystems in the U.S. and Europe, Cambridge Institute on Religion and International Studies, Cambridge, available at: http://ciris.org.uk/wp-content/uploads/2018/02/TPNRD-Religion-and-Fake-News.pdf (accessed 19 March 2022).

Facebook (2019), FAQs, Facebook, Menlo Park, CA, available at: https://investor.fb.com/resources/default.aspx (accessed 19 March 2022).

Fandos, N. (2017), “Fact-checking the White House ‘alternative facts’”, The Seattle Times, 22 January, available at: https://www.seattletimes.com/news/fact-checking-the-white-house-alternative-facts/ (accessed 19 March 2022).

Fedorov, A.V., Levickaja, A.A. and Novikov, A.S. (2020), "Koronavirus kak istochnik medijnykh manipuliacyi (Coronavirus as a source of media manipulation)", Crede Experto: transport, obshchestvo, obrazovanije, jazyk, Vol. 2, pp. 69-80.

Fox, C.J. (1983), Information and Misinformation: An Investigation of the Notions of Information, Misinformation, Informing and Misinforming, Greenwood Press, Westport, CT.

Frederick, S. (2005), “Cognitive reflection and decision making”, Journal of Economic Perspectives, Vol. 19 No. 4, pp. 25-42.

Funke, D. (2021), “What Rep. Marjorie Taylor Greene has said about election fraud, QAnon and other conspiracy theories”, Poynter, 3 February, available at: https://www.poynter.org/fact-checking/2021/what-rep-marjorie-taylor-greene-has-said-about-election-fraud-qanon-and-other-conspiracy-theories/ (accessed 19 March 2022).

Galyashina, E.I. (2021), "Feiking kak novaja ugroza mediabezopasnosti: lingvojuridicheskij effekt ('Faking' as a new threat to media security: the linguo-legal aspect)", Etnopsicholingvistika, Vol. 2 No. 5, pp. 7-24.

Graves, M. and Fraser-Rahim, M. (2021), “The U.S. needs deradicalization–for Christian extremists”, Foreign Policy, 23 March, available at: https://foreignpolicy.com/2021/03/23/usa-needs-qanon-deradicalization-christian-extremists/ (accessed 19 March 2022).

Herrero-Diz, P., Conde-Jimenez, J. and Reyes de Cozar, S. (2020), “Teens' motivations to spread fake news on WhatsApp”, Social Media + Society, Vol. 6 No. 3, pp. 1-14, doi: 10.1177/2056305120942879.

Hitler, A. (1939), Mein Kampf (translated by James Murphy), Hurst and Blackett, London, available at: https://gutenberg.net.au/ebooks02/0200601h.html.

Hughes, H.C. and Waismel-Manor, I. (2020), “The Macedonian fake news industry and the 2016 US election”, PS: Political Science & Politics, Vol. 54 No. 1, pp. 19-23, available at: https://doi.org/10.1017/S1049096520000992 (accessed 25 March 2022).

Il’ichova, L.E. and Kondrashov, A.O. (2018), “Fal’shyvije novosti kak instrument informacionnovo protivoborstva [Fake news as a tool of information warfare]”, Gosudarstvennaja sluzhba, Vol. 6, pp. 77-81.

Jaster, R. and Lanius, D. (2021), “Speaking of fake news: definitions and dimensions”, in Bernecker, S., Flowerree, A.K. and Grundmann, T. (Eds), The Epistemology of Fake News, Oxford University Press, Oxford, pp. 19-45.

Jia, F. (2020), “Misinformation literature review: definitions, taxonomies and models”, International Journal of Social Science and Education Research, Vol. 3 No. 12, pp. 85-90, doi: 10.6918/IJOSSER.202012_3(12).0011.

Jusha, A.E. (2019), “Metody verifikaciji informaciji v period postpravdy (Information verification methods in the period of post-truth)”, Mediasreda, Vol. 1, pp. 181-187.

Karlova, N.A. and Fisher, K.E. (2013), “A social diffusion model of misinformation and disinformation for understanding human information behaviour”, Information Research, Vol. 18 No. 1, 573, available at: http://InformationR.net/ir/18-1/paper573.html (accessed 19 March 2022).

Karlsen, R. and Aalberg, T. (in press), “Social media and trust in news: an experimental study of the effect of Facebook on news story credibility”, Digital Journalism, available at: https://doi.org/10.1080/21670811.2021.1945938 (accessed 19 March 2022).

Kelly, M. (2021), “Joe Biden says Facebook isn't ‘killing people’, but misinformation causes harm”, The Verge, 19 July, available at: https://www.theverge.com/2021/7/19/22583809/joe-biden-facebook-covid19-coronavirus-vaccine-misinformation-killing-people (accessed 19 March 2022).

Koshkin, P. (2020), “Informacionno-politicheskije aspekty pandemii koronavirusa v Sojedinionych Shtatah [Information and political aspects of the Covid-19 pandemic in the United States]”, Puti k miru i bezopasnosti, Vol. 2 No. 59, pp. 120-132, available at: https://doi.org/10.20542/2307-1494-2020-2-120-132 (accessed 19 March 2022).

Krishna, A. (2016), “Motivation with misinformation: conceptualizing lacuna individuals and publics as knowledge-deficient, issue-negative activists”, Journal of Public Relations Research, Vol. 29 No. 4, pp. 176-193.

Kuhlthau, C. (2004), Seeking Meaning: A Process Approach to Library and Information Services, Libraries Unlimited, Santa Barbara, CA.

Kunda, Z. (1990), “The case for motivated reasoning”, Psychological Bulletin, Vol. 108 No. 3, pp. 480-498.

Kushneruk, S.L. and Chudinov, A.P. (2019), “Stanovlenije lingvistiki informacionno-psichologicheskoj vojny: metodologicheskaja neodnorodnost’ i pervyje rezultaty (Linguistics of information-psychological war in the making: methodological heterogeneity and first results)”, Ekologija jazyka i kommunikativnaja praktika, Vol. 4 No. 1, pp. 105-118.

Lazer, D.M.J., Baum, M.A., Benkler, Y., Berinsky, A.J., Greenhill, K.M., Menczer, F., Metzger, M.J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S.A., Sunstein, C.R., Thorson, E.A., Watts, D.J. and Zittrain, J.L. (2018), “The science of fake news”, Science, Vol. 359 No. 6380, pp. 1094-1096.

Longo, B. (2013), “Learning on the wires: BYOD, embedded systems, wireless technologies and cybercrime”, Legal Information Management, Vol. 13 No. 2, pp. 119-123.

Manoylo, A.V. and Popadiuk, A.E. (2020), “Zarubezhnyje nauchnyje podhody k issledovaniju ‘feikovyh novostej’ v mirovoj politike (Foreign scientific approaches to the study of ‘fake news’ in world politics)”, Rosija i Sovremennyj Mir, Vol. 2 No. 107, pp. 285-300.

Martel, C., Pennycook, G. and Rand, D.G. (2020), “Reliance on emotion promotes belief in fake news”, Cognitive Research: Principles and Implications, Vol. 5, article no. 7.

Mason, R. (2017), “Theresa May accuses Russia of interfering in elections and fake news”, The Guardian, available at: https://www.theguardian.com/politics/2017/nov/13/theresa-may-accuses-russia-of-interfering-in-elections-and-fake-news.

Michel, C. (2020), “Cuneiform fakes: a long history from antiquity to the present day”, in Michel, C. and Friedrich, M. (Eds), Fakes and Forgeries of Written Artefacts from Ancient Mesopotamia to Modern China, De Gruyter, Berlin, pp. 25-60, available at: https://www.degruyter.com/document/doi/10.1515/9783110714333-002/pdf (accessed 25 March 2022).

“Most Republicans still believe 2020 election was stolen from Trump – poll” (2021), The Guardian, available at: https://www.theguardian.com/us-news/2021/may/24/republicans-2020-election-poll-trump-biden (accessed 25 March 2022).

Mozilla Foundation (2021), YouTube Regrets. A Crowdsourced Investigation into YouTube's Recommendation Algorithm, Mozilla Foundation, San Francisco, CA, available at: https://assets.mofoprod.net/network/documents/Mozilla_YouTube_Regrets_Report.pdf (accessed 25 March 2022).

Nahl, D. and Bilal, D. (Eds) (2007), Information and Emotion. The Emergent Affective Paradigm in Information Behavior Research and Theory, Information Today, Inc, Medford, NJ.

Narayanan, V., Howard, P.N., Kollanyi, B. and Elswah, M. (2017), Russian Involvement and Junk News during Brexit, Oxford Internet Institute, Oxford University, (COMPROP data memo 2017.10) available at: https://blogs.oii.ox.ac.uk/wp-content/uploads/sites/93/2017/12/Russia-and-Brexit-v27.pdf (accessed 25 March 2022).

“NHS takes action against coronavirus fake news online” (2020), NHS England, available at: https://www.england.nhs.uk/2020/03/nhs-takes-action-against-coronavirus-fake-news-online/ (accessed 25 March 2022).

Osmundsen, M., Bor, A., Vahlstrup, P., Bechmann, A. and Petersen, M. (2021), “Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter”, American Political Science Review, Vol. 115 No. 3, pp. 999-1015.

Oxford English Dictionary (2022), “Misinformation”, Oxford University Press, Oxford, available at: https://www.oed.com.

Pennycook, G. and Rand, D.G. (2019), “Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning”, Cognition, Vol. 188, pp. 39-50.

Pennycook, G. and Rand, D.G. (2020), “Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity and analytic thinking”, Journal of Personality, Vol. 88 No. 2, pp. 185-200.

Pennycook, G. and Rand, D.G. (2021), “The psychology of fake news”, Trends in Cognitive Sciences, Vol. 25 No. 5, pp. 388-402.

Pennycook, G., McPhetres, J., Zhang, Y. and Rand, D. (2020), “Fighting COVID-19 misinformation on social media: experimental evidence for a scalable accuracy nudge intervention”, Psychological Science, Vol. 31 No. 7, pp. 770-780.

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A.A., Eckles, D. and Rand, D. (2021), “Shifting attention to accuracy can reduce misinformation online”, Nature, Vol. 592, pp. 590-595, available at: https://doi.org/10.1038/s41586-021-03344-2 (accessed 26 March 2022).

Reedy, J., Wells, C. and Gastil, J. (2014), “How voters become misinformed: an investigation of the emergence and consequences of false factual beliefs”, Social Science Quarterly, Vol. 95 No. 5, pp. 1399-1418.

“Rizza Islam” (2021), Wikipedia, available at: https://en.wikipedia.org/wiki/Rizza_Islam (accessed 26 March 2022).

Rodriguez, S. (2021), “Biden on Facebook: ‘They’re killing people’ with misinformation”, CNBC, available at: https://www.cnbc.com/2021/07/16/white-house-says-facebook-needs-to-do-more-to-fight-vaccine-misinformation.html (accessed 26 March 2022).

Romanov, A.A. and Shabaev, V.V. (2020), “‘Fake news’ v social’nych setiach, blogach i mesendzherach kak ugroza informacionnoj bezopasnosti [‘Fake news’ in social networks, blogs and instant messengers as a threat to information security]”, Jurist-Pravoved, Vol. 1, pp. 16-20.

Rubin, V.L. (2019), “Disinformation and misinformation triangle: a conceptual model for ‘fake news’ epidemic, causal factors and interventions”, Journal of Documentation, Vol. 75 No. 5, pp. 1013-1034, doi: 10.1108/JD-12-2018-0209.

Saracevic, T. (2007), “Relevance: a review of the literature and a framework for thinking on the notion in information science. Part III: Behavior and effects of relevance”, Journal of the American Society for Information Science and Technology, Vol. 58 No. 13, pp. 2126-2144.

Sardou, O.T. (2021), Regulatory Sanctions and Their Effects on the Stock Market. An Analysis of the Effects of Finansinspektionens Sanctions on the Sanctioned Firms' Stock Price, [Unpublished Bachelor’s degree thesis], Stockholm Business School, available at: https://www.diva-portal.org/smash/get/diva2:1637576/FULLTEXT01.pdf (accessed 26 March 2022).

Sarkisiants, V.R. and Riabova, M.V. (2019), “Fejkovyje novosti: kommunikativnyj i lingvojuridicheskij aspekty (Fake news: communicative and linguo-legal aspects)”, Humanitarnyje i Social’nyje Nauki, No. 6, pp. 209-220.

Savolainen, R. (1995), “Everyday life information seeking: Approaching information seeking in the context of ‘way of life’”, Library & Information Science Research, Vol. 17 No. 3, pp. 259-294.

Schutz, A. (1962), “Choosing among projects of action”, Collected Papers. Volume I: The Problem of Social Reality, Martinus Nijhoff, Leiden, pp. 67-96.

Shapira, D. (2020), “Et tout le reste est littérature, or: Abraham Firkowicz, the writer with a chisel”, in Michel, C. and Friedrich, M. (Eds), Fakes and Forgeries of Written Artefacts from Ancient Mesopotamia to Modern China, De Gruyter, Berlin, pp. 173-194, available at: https://www.degruyter.com/document/doi/10.1515/9783110714333-002/pdf (accessed 26 March 2022).

Slevin, P. (2021), “The power of political disinformation in Iowa”, The New Yorker, available at: https://www.newyorker.com/news/campaign-chronicles/the-power-of-political-disinformation-in-iowa (accessed 26 March 2022).

Søe, S.O. (2018), “Algorithmic detection of misinformation and disinformation: Gricean perspectives”, Journal of Documentation, Vol. 74 No. 2, pp. 309-332.

Sukhodolov, A.P. (2017), “Fenomen ‘feikovych novostej’ v sovremennom mediaprostranstve (The phenomenon of ‘fake news’ in the modern mediaspace)”, Evroaziatskoje Sotrudnichestvo: Humanitarnyje Aspekty, No. 1, pp. 87-106.

Sukhodolov, A.P. and Bychkova, A.M. (2017), “Fake news as a modern media phenomenon: definition, types, role of fake news and ways of counteracting it”, Theoretical and Practical Issues of Journalism, Vol. 6 No. 2, pp. 143-169.

Sunstein, C.R. (2021), Liars: Falsehoods and Free Speech in an Age of Deception, Oxford University Press, Oxford.

Susmann, M.W. and Wegener, D.T. (2022), “The role of discomfort in the continued influence effect of misinformation”, Memory & Cognition, Vol. 50, pp. 435-448.

Timberg, C. (2021), “New whistleblower claims Facebook allowed hate, illegal activity to go unchecked”, Washington Post, available at: https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblower-complaint/ (accessed 26 March 2022).

Uscinski, J.E., Klofstad, C. and Atkinson, M.D. (2016), “What drives conspiratorial beliefs? The role of informational cues and predispositions”, Political Research Quarterly, Vol. 69 No. 1, pp. 57-71.

Vakkari, P. (1999), “Task-based information searching”, Annual Review of Information Science and Technology (ARIST), Vol. 37, pp. 413-464.

van der Linden, S., Leiserowitz, A., Rosenthal, S. and Maibach, E. (2017), “Inoculating the public against misinformation about climate change”, Global Challenges, Vol. 1 No. 2, 1600008, available at: https://doi.org/10.1002/gch2.201600008 (accessed 26 March 2022).

Vasu, N., Ang, B., Teo, T.A., Jayakumar, S., Faizal, M. and Ahuja, J. (2018), Fake News: National Security in the Post-truth Era. Policy Report 2018, Nanyang Technological University, available at: https://www.rsis.edu.sg/wp-content/uploads/2018/01/PR180313_Fake-News_WEB.pdf (accessed 26 March 2022).

Vosoughi, S., Roy, D. and Aral, S. (2018), “The spread of true and false news online”, Science, Vol. 359 No. 6380, pp. 1146-1151.

Vraga, E.K. and Bode, L. (2020), “Defining misinformation and understanding its bounded nature: using expertise and evidence for describing misinformation”, Political Communication, Vol. 37 No. 1, pp. 136-144, doi: 10.1080/10584609.2020.1716500.

Wang, Y., McKee, M., Torbica, A. and Stuckler, D. (2019), “Systematic literature review on the spread of health-related misinformation on social media”, Social Science & Medicine, Vol. 240, 112552, available at: https://doi.org/10.1016/j.socscimed.2019.112552 (accessed 26 March 2022).

Wardle, C. (2019), Understanding Information Disorder, FirstDraft, supported by Google News Initiative, available at: https://firstdraftnews.org/wp-content/uploads/2019/10/Information_Disorder_Digital_AW.pdf?x76701.

Watzlawick, P. (1977), How Real is Real? Confusion, Disinformation, Communication, Vintage Books, New York, NY.

Wilson, T.D. (2016), “A general theory of human information behaviour”, Proceedings of ISIC, the Information Behaviour Conference, Zadar, Croatia, 20-23 September, 2016, Part 1. Information Research, Vol. 21 No. 4, isic1601, available at: http://InformationR.net/ir/21-4/isic/isic1601.html (accessed 26 March 2022).

Wilson, J. (2020), “Proud Boys are a dangerous ‘white supremacist’ group say US agencies”, The Guardian, available at: https://www.theguardian.com/world/2020/oct/01/proud-boys-white-supremacist-group-law-enforcement-agencies (accessed 26 March 2022).

Winter, S., Metzger, M.J. and Flanagin, A.J. (2016), “Selective use of news cues: a multiple motive perspective on information selection in social media environments”, Journal of Communication, Vol. 66 No. 4, pp. 669-693.

Xu, H. and Guo, Z. (2018), “Using text mining to compare online pro- and anti-vaccine headlines: word usage, sentiments and online popularity”, Communication Studies, Vol. 69 No. 1, pp. 103-122.

Yeo, S.K. and McKasy, M. (2021), “Emotion and humor as misinformation antidotes”, Proceedings of the National Academy of Sciences of the United States of America, Vol. 118, No. 15, e2002484118, available at: https://doi.org/10.1073/pnas.2002484118 (accessed 25 March 2022).

Zimmermann, F. and Kohring, M. (2020), “Mistrust, disinforming news, and vote choice: a panel survey on the origins and consequences of believing disinformation in the 2017 German Parliamentary Election”, Political Communication, Vol. 37 No. 2, pp. 215-237, doi: 10.1080/10584609.2019.1686095.

Zuikina, K.L. and Sokolova, D.V. (2019), “Specifika kontenta rosijskich feikovych novostej v internete i televideniji [The specific content of Russian fake news on the internet and on TV]”, Vestnik Moskovskogo Universiteta. Serija 10: Zhurnalistika, Vol. 4, pp. 3-22, available at: https://doi.org/10.30547/vestnik.journ.4.2019.322 (accessed 26 March 2022).

Acknowledgements

No funding was received for the preparation of this paper.

The authors thank Professor Reijo Savolainen for helpful comments on an earlier version of this paper and the anonymous referees for further comments that have resulted in improvements to the text.

Corresponding author

Thomas D. Wilson can be contacted at: wilsontd@gmail.com
