Digital Technology and Voice: How Platforms Shape Institutional Processes Through Visibilization

Ali Aslan Gümüsay (Humboldt Institute for Internet and Society, Germany)
Mia Raynard (WU Vienna University of Economics and Business, Austria)
Oana Albu (Copenhagen Business School, Denmark)
Michael Etter (King's College London, UK)
Thomas Roulet (University of Cambridge, UK)

Digital Transformation and Institutional Theory

ISBN: 978-1-80262-222-5, eISBN: 978-1-80262-221-8

ISSN: 0733-558X

Publication date: 23 September 2022


Digital technologies, and the affordances they provide, can shape institutional processes in significant ways. In the last decade, social media and other digital platforms have redefined civic engagement by enabling new ways of connecting, collaborating, and mobilizing. In this article, we examine how technological affordances can both enable and hinder institutional processes through visibilization – which we define as the enactment of technological features to foreground and give voice to particular perspectives and discourses while silencing others. We study such dynamics by examining #SchauHin, an activist campaign initiated in Germany to shine a spotlight on experiences of daily racism. Our findings show how actors and counter-actors differentially leveraged the technological features of two digital platforms to shape the campaign. Our study has implications for understanding the role of digital technologies in institutional processes as well as the interplay between affordances and visibility in efforts to deinstitutionalize discriminatory practices and institutions.



Gümüsay, A.A., Raynard, M., Albu, O., Etter, M. and Roulet, T. (2022), "Digital Technology and Voice: How Platforms Shape Institutional Processes Through Visibilization", Gegenhuber, T., Logue, D., Hinings, C.R.(B). and Barrett, M. (Ed.) Digital Transformation and Institutional Theory (Research in the Sociology of Organizations, Vol. 83), Emerald Publishing Limited, Leeds, pp. 57-85.



Emerald Publishing Limited

Copyright © 2022 Ali Aslan Gümüsay, Mia Raynard, Oana Albu, Michael Etter and Thomas Roulet


Published by Emerald Publishing Limited. These chapters are published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


In recent years, large protests against ethnic violence have erupted around the world – particularly in the United States, where the deaths of Black Americans including George Floyd and Breonna Taylor have unleashed a flood of criticism and civil unrest. Amidst the escalating anger and calls of “No justice, no peace,” social injustice and racial divisions have taken center stage. What the expansive scope and momentum of movements such as #BlackLivesMatter have taught us is that digital technologies – and particularly social media – are changing the face of politics and activism (Ouellette & Banet-Weiser, 2018). Individuals, organizations, and activist groups are increasingly taking to social media and other digital platforms to raise awareness of systemic racism and to call for the deinstitutionalization of this deeply ingrained problem (Gantt Shafer, 2017; Matamoros-Fernández, 2017).

Digital platforms are online, on-demand systems that have the potential to harness and create large scalable networks of users and resources (Castells, 1998). By providing expansive and immediate connectivity (van Dijck, 2013), digital platforms have become sites of interaction, debate, and conflict that represent a heterogeneity of “norms, values, expectations, and concerns” (Etter, Colleoni, Illia, Meggiorin, & D’Eugenio, 2018, p. 61). Disparate communities – each with their own interests and agendas – are able to come together and engage in various forms of co-creation, ranging from spontaneous (Albu & Etter, 2016) to more orchestrated iterations (Etter & Vestergaard, 2015; Gegenhuber & Naderer, 2019). Such new ways of connecting, collaborating, and mobilizing (Dobusch & Schoeneborn, 2015; Vaast & Kaganer, 2013) have facilitated an aggregation of “voices” in ways that can significantly shape institutional processes (Etter, Ravasi, & Colleoni, 2019; Illia et al., 2022; Roulet, 2020; Scheidgen, Gümüsay, Günzel-Jensen, Krlev, & Wolf, 2021; Wang, Raynard, & Greenwood, 2021). As certain voices are aggregated, they are foregrounded and made visible – while others are pushed to the background, potentially becoming unseen and unvoiced (Hudson, Okhuysen, & Creed, 2015). Thus, the act of making something visible involves an interplay between discursive openness and discursive closure, because the struggle to promote a particular view of reality often has the effect of subordinating equally plausible ones (Clemente & Roulet, 2015; Deetz, 1992; Leonardi & Jackson, 2004).

Our interest in this article is to explore the implications of digital technologies for voice, visibility, and institutions. Specifically, we aim to understand how technology can enable and hinder institutional processes through visibilization – which we define as the enactment of technological features to foreground and give voice to particular perspectives, positions, and discourses while silencing or subordinating others. We do so by examining the emergence of #SchauHin, a campaign in Germany that sought to bring daily experiences of systemic racism into the public sphere. Drawing upon multiple data sources and first-hand accounts from those involved in the campaign, we unpack the various ways in which users effected visibilization and influenced the development of the campaign and its goal of contributing to the deinstitutionalization of systemic racism. By showing how users differentially used and appropriated technological features to open and close discourses, this study aims to advance research at the intersection of technology and institutional theory in two ways.

First, it contributes to a relational understanding of technology by emphasizing its affordances, i.e., “the action possibilities and opportunities that emerge from actors engaging with technologies” (Faraj & Azad, 2012, p. 238). Digital platforms create opportunities to mobilize power and collective action, not through their “objective” features but through their ability to enable expansive, immediate connectivity and the distributed creation and dissemination of content and knowledge (van Dijck, 2013). In our case, initiators and supporters of the campaign engaged in a discursive struggle with “counter-actors” who sought to disrupt mobilization – with each side enacting platform properties in radically different ways. By showing how this struggle played out, our study extends understandings of “affordances-in-practice” (Costa, 2018) and shows how users “reconcile their own goals with the materiality of a technology” (Leonardi, 2011, p. 154).

Second, the study sheds further light on how technology can influence institutional processes (Hinings, Gegenhuber, & Greenwood, 2018) by zooming in on a specific affordance of technology: visibility. Visibility is conceptualized as a “root-affordance” on which other affordances are built (Treem, Leonardi, & van den Hooff, 2020, p. 45; cf. also Flyverbom, Leonardi, Stohl, & Stohl, 2016). Our case builds on this conceptualization by examining how platform features are activated by different sets of actors. Specifically, we show how activation can, on the one hand, generate visibility by opening up discourses about daily racism and, on the other, obscure visibility through the manipulation of content and the sowing of confusion (Etter & Albu, 2020; Treem et al., 2020). In addition, we show how digital platforms have their own “enactment” properties – as the algorithms and hidden information architectures embedded in digital platforms (Hansen & Flyverbom, 2015) can curate and make some knowledge, behaviors, and preferences visible and others less so. Thus, visibility, as an affordance, has both relational and strategic qualities that are enacted in the process of “seeing and being seen” (Brighenti, 2007, p. 325). Our case illuminates these qualities and their implications for enabling or hindering reflection and the critique of intangible aspects of institutions – in our case, systemic racism.

On a practical level, our article demonstrates how digital technologies – and platforms in particular – have fundamentally altered civic engagement. Not only do these platforms have the potential to amplify and silence voices (Clemente & Roulet, 2015; Etter & Albu, 2020), they can also facilitate or hinder reflection on and action toward taken-for-granted practices and arrangements.

Theoretical Framework

Institutional Processes, Visibility, and Digital Platforms

It can be argued that the emergence, change, and decline of institutions requires institutionalized practices and arrangements to be made visible (Clemente & Roulet, 2015; Washington & Ventresca, 2004). Studies of institutional emergence, for example, have shown that increasing visibility of the limits or general failings of present institutional arrangements can lead to a mobilization of power and collective action by “champions of new practices and forms” (Schneiberg & Lounsbury, 2017, p. 284; see also Hoffman, 1999; Rodner, Roulet, Kerrigan, & Vom Lehn, 2020; Zietsma, Groenewegen, Logue, & Hinings, 2017). As practices become habitual and widely accepted, they become visible and, in turn, identifiable (Tolbert & Zucker, 1999). Such visibility has also been shown to trigger processes of deinstitutionalization – notably by prompting reflexivity and (re-)examination of taken-for-granted arrangements and social practices (Dacin & Dacin, 2008; Maguire & Hardy, 2009; Seo & Creed, 2002).

While visibility can enhance the salience of certain practices, voices, and meanings that are manifested in institutional arrangements (Clemente & Roulet, 2015), it may also subordinate or divert attention away from others. This subordination of alternative ways of “doing” or “being” often contributes to processes of institutional maintenance because the voices of marginalized actors are suppressed or pushed into obscurity (Hudson et al., 2015; Mair & Martí, 2009). In this way, visibility and obscurity represent two sides of the same coin – with both shaping institutional processes in significant ways.

Within institutional scholarship, the concept of visibility is often only implicitly acknowledged – in part because institutional arrangements are understood to be supported by intangible sets of beliefs and values (Thornton & Ocasio, 2008) or by discursive productions that are not necessarily accessible to or consumable by all parties (Phillips & Oswick, 2012). Many foundational pillars of institutional arrangements are taken for granted, which makes their very nature invisible, even for those who enact them. Recently, however, studies have begun to emphasize visible material manifestations of institutions as “part of the way in which social processes and organizations are enacted and stabilized” (Monteiro & Nicolini, 2015, p. 61). Practices typically have, for example, a material aspect (Jones, Boxenbaum, & Anthony, 2013) that makes them visible to others (Boxenbaum, Jones, Meyer, & Svejenova, 2018) and, further, makes an actor’s engagement with an institution visible and the monitoring of practice diffusion possible (Chandler & Hwang, 2015). Another stream of related research has shown how actors make their beliefs and values “seen” by voicing them (Cornelissen, Durand, Fiss, Lammers, & Vaara, 2015). Together, these streams of research suggest that actors’ discursive productions are a reflection of their interaction with institutions (Meyer, Jancsary, Höllerer, & Boxenbaum, 2018; Wang et al., 2021), and that through reflexive interactions, audiences may become aware of the structures underpinning institutions (Gray, Purdy, & Ansari, 2015; Raynard, Kodeih, & Greenwood, 2020).

Whereas the visibility of practices, voices, and meanings has traditionally been limited by the “spatial and temporal properties of the here and now,” the development of information technologies has brought “a new form of visibility” (Thompson, 2005, p. 35). By enabling expansive connectivity, decentralized content creation, and distributed content aggregation, social media and other digital platforms have opened up opportunities for a wider range of actors to affect institutional processes (Etter et al., 2018; Illia et al., 2022). Marginalized actors, for example, are able to leverage diverse media to air grievances and raise awareness of endemic problems and social injustices (Harmon, 2019; Toubiana & Zietsma, 2017). Thus, whereas visibility and voice had previously been understood as a privilege of the large and powerful – i.e., those with high status, positions of authority, or control over important and extensive resources (Deephouse & Carter, 2005; Roulet, 2020), social media has leveled the playing field to some extent (Etter et al., 2018, 2019; Seidel, Hannigan, & Phillips, 2020). In particular, digital media platforms have provided an influential “podium” for marginalized actors (Wright, Meyer, Reay, & Staggs, 2020), while making large and powerful actors more vulnerable to intensive and widespread scrutiny (Daudigeos, Roulet, & Valiorgue, 2020; den Hond & de Bakker, 2007). In this sense, institutional arrangements may be more easily challenged or maintained, even by marginal actors.

Another important change brought on by social media is that it has increased the velocity of content dissemination by enhancing the speed and direction of communication (Castelló, Etter, & Nielsen, 2016; Etter et al., 2019; Wang, Reger, & Pfarrer, 2021). Hidden practices and events can be made public, often instantaneously or with very short time lags (Thompson, 2005). An illustrative example can be seen in how social media has enabled widespread exposure of police violence against Black people, thereby generating awareness and triggering collective mobilization (Ramsden, 2020). The increased velocity of content dissemination has, thus, helped overcome temporal and spatial distance by enabling direct engagement with communities who would otherwise have remained difficult to reach through traditional channels (Breuer, Landman, & Farquhar, 2015; Heavey, Simsek, Kyprianou, & Risius, 2020).

As a result of this change in scope and velocity, social media discourses have become increasingly intrusive, unwieldy, and hard to control (Altheide, 2013; Wang et al., 2021). Indeed, the fluid and diffuse nature of social media communities makes the control of content and exposure highly challenging (Etter et al., 2019; Roulet, 2020). As Heavey and colleagues (2020, p. 1494) point out, “because communication boundaries are porous on social media, messages targeted at one audience may spillover to others and have a raft of unintended consequences.” Thus, while digital platforms can help actors open up discourses in ways that can mobilize collective action and tackle problematic aspects of institutions (Albu & Etter, 2016; Thompson, 2005), they can also lead to discursive closure, both intentionally and unintentionally (Etter & Albu, 2020).

In the next section, we build on these insights into visibility and institutional processes, situating them within an affordance-based perspective on technology. We then combine insights from these different streams of research to develop the concept of visibilization.

Technological Affordances and Visibilization

The widespread adoption of digital platforms for organizing has raised compelling questions about the ways in which these technologies affect processes of coordination and collaboration (Barberá-Tomás, Castelló, de Bakker, & Zietsma, 2019; Gegenhuber & Naderer, 2019; Leonardi, 2014; Leonardi & Vaast, 2017; Madsen, 2016; Seidel et al., 2020; Treem & Leonardi, 2013). The visibility afforded by digital platforms is commonly assumed to facilitate the transmission of information. However, recent studies also suggest that such visibility may have negative implications, as it paradoxically generates closure through information overload (Chen & Wei, 2019) and algorithmic distortion (Etter & Albu, 2020). It is thus important to elucidate how visibilization gives voice to particular perspectives, positions, and discourses while silencing or subordinating others. This is particularly important in order to further unpack the dark side of, or the negative social consequences associated with, digitalization (Trittin-Ulbrich, Scherer, Munro, & Whelan, 2021).

To gain a richer understanding that takes nuanced forms of visibility into account, we adopt an affordance perspective that pays particular attention to socio-materiality (Leonardi, 2012). From such a standpoint, it is the interplay or imbrication (Leonardi, Huysman, & Steinfield, 2013) of the separate but interacting actors – be they social (i.e., users) or material (i.e., digital platforms) – that facilitates the opening and closure of discourses. The material features of technologies (e.g., deleting, adding, or sharing functions) enable particular ways of creating and diminishing the visibility of discourses. At the same time, social actors or users – having different intentions and capabilities – can affect visibility in ways that open up or close down discourses. For example, through their use of these technologies, social actors can coordinate activities, sway public opinion, or disturb collective action through negative, antisocial, thrill-seeking behavior (Cook, Schaafsma, & Antheunis, 2018). Thus, it is the relational interplay between features and contextual use that gives visibility to voices.

Recently, scholars have highlighted that visibility should also be understood from the receiver’s perspective, namely for whom content becomes (in-)visible (Treem et al., 2020). Indeed, some communication is only visible to a small in-group or to actors who inhabit a semi-public sphere, while being invisible to many others. For social movements and activists, these questions are important, as content can be targeted at small or even hidden groups for reasons of coordination (Albu, 2019; Uldam & Kaun, 2018), or it can be targeted at larger audiences with the aim of mobilization (Bennett & Segerberg, 2012). Again, it is the interplay between features and contextual use that shapes the different forms of visibility and closure.

Furthermore, scholars have highlighted the mediating role of algorithms as central to the forms of visibility and opaqueness specific to digital platforms (Milan, 2015). Algorithms can be understood as “sets of coded instructions” (van Dijck & Poell, 2013, p. 5) or “formalized rules embedded in technological artifacts” (Coretti & Pica, 2018, p. 73) that have an “entangled, complex, and dynamic agency” (Glaser, Pollock, & D’Adderio, 2021, p. 2) given the co-constitution of technological features and social practices. Algorithms impact what becomes visible as much as what becomes invisible on social media (Hansen & Flyverbom, 2015). They do so by performing “sorting, filtering, and ranking functions” (Neumayer & Rossi, 2016, p. 4) that steer attention and interactions (van Dijck & Poell, 2013) or overrepresent certain forms of interaction and devalue others (Bucher, 2012; Gillespie, 2014; Rieder, 2012). Research has shown that algorithms may work against users’ aims of making certain discourses visible (Poell & van Dijck, 2015) while closing others (Etter & Albu, 2020; Uldam & Kaun, 2018). Indeed, organizations that run social media platforms are often profit oriented and have designed algorithms to provide visibility to certain content with the goal of increasing user engagement for purposes of data collection and advertising (Gillespie, 2014).
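To make the “sorting, filtering, and ranking functions” discussed above concrete, the following is a minimal, stylized sketch in Python – not any platform’s actual trending algorithm, and all names (`trending`, `boost`, `min_posts`) are illustrative assumptions. It shows how a simple engagement-weighted ranking filters out low-volume topics and over-represents heavily shared content, thereby deciding which hashtags become visible and which remain invisible:

```python
from collections import defaultdict

def trending(posts, boost=2.0, min_posts=3, top_n=5):
    """Rank hashtags by engagement-weighted volume.

    Each post is a dict: {"tags": [...], "likes": int, "shares": int}.
    Filtering (min_posts), weighting (boost on shares), and sorting
    jointly determine which tags surface as "trending" - and which
    never become visible at all.
    """
    score = defaultdict(float)
    count = defaultdict(int)
    for post in posts:
        for tag in post["tags"]:
            count[tag] += 1
            # Shares are weighted more heavily than likes, so widely
            # reshared content is systematically over-represented.
            score[tag] += 1 + post["likes"] + boost * post["shares"]
    # Tags below the volume threshold are filtered out entirely.
    ranked = [(tag, s) for tag, s in score.items() if count[tag] >= min_posts]
    return [tag for tag, _ in sorted(ranked, key=lambda x: -x[1])[:top_n]]
```

Even in this toy version, a hashtag mentioned only once never appears, however important its content – a simple illustration of how algorithmic curation devalues some forms of interaction while amplifying others.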

Overall, then, we understand the visibilization process as one accomplished by the interplay of openness and closure. This emerges from the interaction of specific digital platform features (e.g., Twitter hashtags powered by algorithms, wiki pages, etc.) and human actors’ contextual intentions and use (e.g., the democratic participation and freedom of speech promoted by activists). Visibilization, in other words, is accomplished by human and nonhuman actors (Latour, 1996) – including the underlying algorithmic and informational architectures of digital platforms (e.g., trending hashtags, newsfeeds). This affordance-based perspective sensitizes scholars to the interplay between the materiality of technology and users’ varying intentions, the combination of which can enhance or obscure the visibility of practices, voices, and meanings that underpin institutional arrangements.


Research Context

The features of particular technologies, combined with their contextual use, create diverse forms of (in-)visibility. To better understand these patterns, we traced the emergence of the #SchauHin campaign in Germany, which sought to raise awareness of systemic racism in everyday interactions. As the campaign touched upon the highly debated issue of racism in German society, it attracted the attention of counter-actors, who sought to preempt and hinder its development. We selected #SchauHin as a paradigmatic case (Flyvbjerg, 2006) that provides a window into technological affordances and their potential role in institutional processes. The nature and development of the campaign, in particular, provided an opportunity to examine how digital platforms generate both visibility and closure for different discourses. We focused on a 16-month period from September 2013 until December 2014 – however, we continued to observe the case and collect data until June 2020. The idea for the campaign was initially discussed on Twitter and then moved to Titanpad – a digital, real-time collaborative text editing and writing platform that existed from 2010 to 2017. Although Titanpad facilitated deeper engagement and the development of ideas among organizers and supporters, counter-actors soon gained access and began disrupting development efforts. In response to this disruption, the campaign moved back to Twitter, which, as a microblogging and social network platform, offered a very different set of technological features than Titanpad.

Because the campaign moved across different digital platforms, and because groups of users appropriated the same technological features in divergent ways, #SchauHin provides an illuminating case for studying how technology shapes institutional processes. For our purposes, it is an ideal context for understanding visibilization and how the appropriation of platform features can create discursive openness and closure.

Data Sources

This study draws on both internal and external data sources of the campaign. We were given access to #SchauHin organizers’ internal documents and data files, which included internal memos, strategy documents, and email exchanges. These data amounted to over 2,000 pages of visuals and text. We also examined data from the Titanpad platform and took screenshots at various points in time. Additionally, we examined the #SchauHin and #SchauHin2 Twitter profiles, manually screening 800 tweets with the hashtag #SchauHin. To supplement these data, we collected an additional 18 media articles and 14 videos that covered the campaign.

Data Analysis

To understand how the different groups of users utilized technological features to influence the campaign and its goal of drawing attention to systemic racism, we employed a qualitative analytic approach (Eisenhardt, 1989; Yin, 1994). As our case could be classified as a digital social movement, we were initially interested in how the digital nature of the social movement impacted organizing and mobilization. However, the emergence of counter-actors who sought to disrupt #SchauHin alerted us to the struggle over visibility, and the potential role that digital platforms may play in shaping this visibility. As we collected further data, and as the #SchauHin campaign progressed, we identified commonalities and differences in how users were enacting various technological features. These patterns prompted us to reflect upon how the features of Titanpad and Twitter impacted the struggle over establishing #SchauHin – and how they affected the campaign’s broader goal of raising awareness of systemic racism.

To organize our data and emerging insights, we structured key events along a chronological timeline. We then examined the content generated on Titanpad and Twitter, mapping it onto the timeline to get a better understanding of how the campaign developed and the actors involved. We also drew on internal documents and media reports to help make sense of the activities and struggles that unfolded.

Once we were confident that we had identified and understood how different platform features and their enactment enabled or hindered the development of the campaign, we sought to gain a deeper understanding of how and why. Our coding and discussions converged upon the importance of visibility, specifically in terms of the perspectives, opinions, and content that supported the campaign and those that detracted or diverted attention away from it. We noted four features, in particular, that actors engaged with to generate or obscure visibility. These included the adding/editing/deleting of content, the use of hashtags, the creation of profiles, and the trending topic algorithm. While the nature and levels of visibility can be somewhat idiosyncratic to the platforms, we focused on broader indications of visibility such as the volume of interactions, as manifested in discussions, tweets, likes, profile follows, as well as the trending of messages. We then examined how visibility shaped discursive openness and closure by foregrounding particular perspectives and positions, while silencing or subordinating others.
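The “broader indications of visibility” mentioned above – volumes of discussions, tweets, likes, and follows – can be illustrated with a minimal, hypothetical sketch. This is not the authors’ actual analytic procedure; the function name and field names are assumptions for illustration. It simply tallies a crude interaction-volume proxy per day, of the kind one might chart along a campaign timeline:

```python
from collections import Counter

def interaction_volume(tweets):
    """Tally a simple per-day visibility indicator.

    Each tweet is a dict: {"day": str, "likes": int, "retweets": int}.
    The tweet itself counts as one interaction; likes and retweets
    add to the day's total - a rough proxy for the volume of
    interactions used as an indicator of visibility.
    """
    volume = Counter()
    for tweet in tweets:
        volume[tweet["day"]] += 1 + tweet["likes"] + tweet["retweets"]
    return dict(volume)
```

Plotting such daily totals for supporter and counter-actor content separately would show, in aggregate terms, how the struggle over visibility unfolded over time.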


The emergence and development of #SchauHin was marked by an ongoing struggle between supporters of the campaign and counter-actors who actively tried to prevent and disrupt mobilization efforts. Central to this struggle was the visibility of communicated content – an affordance that was differentially appropriated by users to enable, facilitate, or hinder the development of the campaign. As supporters tried to generate visibility and open up discourse around daily racism, counter-actors sought to hinder such efforts by obscuring content and enacting discursive closure. Below, we begin with a short overview of how the campaign started. We then describe how four digital platform features were differentially used by each group of actors to accomplish divergent aims. We highlight, in particular, how the interplay between different technological features and their contextual use shaped the struggle around visibility and invisibility.

Initiating the Campaign

The idea for the #SchauHin campaign emerged during a conference at the Friedrich Ebert Foundation in Berlin on September 2, 2013. Activists, bloggers, and journalists came together to discuss topics such as blogging about sexism and racism, the role of the mass media, and the differences between the mass media, social media, and the blogosphere. One central theme that repeatedly emerged was the lack of visibility of stories and experiences from people confronting racism. One panelist suggested creating a hashtag to start a conversation and allow people to share their experiences of daily racism:

Can I make a suggestion first? The issue is racism and sexism. This is actually the ultimate opportunity, where these different blogospheres on the internet have possibly just come together, where probably people from both areas and even more are watching the livestream. Maybe in the livestream you can discuss what kind of hashtag could be used for everyday racism as a topic. And “everyday racism” is too long, so something shorter please. (Panel discussion “Rassismus & Sexismus ab_bloggen” (blog_away racism and sexism))

Conference participants took up this call and began enlisting people to help find an appropriate and catchy name for the hashtag, which could be used to draw attention to systemic racism in day-to-day encounters:

Looking for a hashtag for everyday racism. Got ideas? #abbloggen ((@User1) September 2, 2013)

The @User2 is looking for a Twitter hashtag to flag up everyday racism. Any ideas? #abbloggen ((@User3) September 2, 2013)

Within four days after the conference, people had tweeted multiple suggestions including #MeinSchland (MyGermany), #keinRassistaber (notaRacistbut) and #rausschrei (outcry). Below are a few examples of how people engaged in the call to find a hashtag:

@User2 @User4 #meinschland and #rausschrei are the ones I like best. #keinRassistaber is also good, but a bit too long. ((@User5) September 6, 2013)

The #-everyday racism suggestions included: #allrass #DeinRassismus #zumausderHautfahren #AFD #keinRassistaber. What do you think of #meinschland? ((@User2) September 6, 2013)

As more and more people began participating in the search for a hashtag, organizers made the decision to move the conversation to the open platform Titanpad. As a web editor, Titanpad provided a way to make views and information visible through written exchange. This effectively enabled more in-depth discussions and engagement. Organizers announced the switch to Titanpad in a tweet:

The search for a hashtag for everyday racism in Germany continues. Here: Ideas? ((@User2) September 6, 2013)

The move to Titanpad marked the beginning of the planning phase of the campaign, as organizers sought to generate visibility for it and open up discourse. Once the planning phase was complete, the organizers launched the campaign by moving to Twitter. Each of these two platforms provided different technological features, which were differentially used by supporters and counter-actors. Table 1 provides an overview of the technological features and summarizes how they were activated to accomplish divergent ends.

Table 1.

Technological Features, Practices and Implications for Visibility.

Feature: Adding, editing, and deleting content (user-driven)
Organizer/supporter practices: Generating visibility by creating content, sharing ideas, and coordinating activity (e.g., voting, posting, commenting).
Counter-actor practices: Obscuring and distorting visibility by spamming or adding off-topic content, posting derogatory or antagonistic comments, and deleting previously established content (e.g., sexist and racist slurs).

Feature: Hashtagging (user-driven)
Organizer/supporter practices: Generating and amplifying visibility by structuring and collating content to facilitate the search function and content dissemination (e.g., using a hashtag and creating a hashtag campaign).
Counter-actor practices: Obscuring visibility by “hijacking” the hashtag to create confusion and misinformation (e.g., using the hashtag in association with different, typically vague or opposing, content).

Feature: Creating a profile (user-driven)
Organizer/supporter practices: Generating and focusing visibility by creating a “go-to” place to post and find information (the profile owner or administrators control the content).
Counter-actor practices: Obscuring and diverting visibility by creating a profile similar in style and name to divert attention away from original content.

Feature: Trending (algorithm-driven)
Organizer/supporter practices: Generating, amplifying, and focusing visibility (intentional – by encouraging more tweets).
Counter-actor practices: Aimed at obscuring content, yet amplifying and focusing visibility (unintentional – by tweeting to divert visibility).

Feature 1: Adding, Editing, and Deleting Content

The Titanpad platform allowed users to add, edit, and delete content – however, this feature could be used for fundamentally different purposes. Whereas organizers and supporters used it to generate visibility for the campaign and its goal of ending systemic racism, counter-actors used it to hinder such efforts. Specifically, the adding, editing, and deleting features of Titanpad were used, on the one hand, to aggregate ideas and voices – generating visibility for the outcomes of such collaborative efforts. Yet, on the other hand, they were also used to distort and alter content in ways that created confusion and obscured visibility.

Generating Visibility and Discursive Openness

Because the Titanpad link could be shared openly, it created an opportunity for people to join the conversation. Anyone with the link could comment, add suggestions, and edit or delete content. With the move to Titanpad, there were more coordinated efforts to come up with a hashtag. Several additional hashtags were proposed and discussed – e.g., #auf180, #SchauHin, #jederfremd, or #rausschrei. After each proposed hashtag, users were free to add comments and respond to others’ comments. Below is an example of one such exchange that took place on September 6, 2013:

“auf180+1” is an interesting suggestion, I think! [editorial note: in German “auf180” means that a person is at 180 (degrees), i.e., boiling, furious.] Short, succinct, symbolizes the anger, the rage associated with everyday racism. +1! thanks, just occurred to me because I often feel that way about this topic. Ilikealot!+1 +1 is about the anger you feel? I think that is connected to it, but it shouldn’t be in the foreground. It’s more about the injustice that is connected to racism –> injustice? Auf180 shows a reaction, a feeling – this includes the injustice, the grief and all that, but it is the result, not the cause? Well, it is not absolutely necessary for the hashtag to describe the cause, is it? It is quite powerful when the hashtag symbolizes: This happens every damn day, this is reality, this makes us sad, angry – and: This is unfair. Schaut hin – open your eyes. Apropos: #Schauhin would also be a good suggestion:) You save two characters with Auf180 to describe the incident compared to Schauhin The only problem: It doesn’t mention racism but still good, I find it somehow ‘more exciting’ > why are you at 180? > read on, eye-opener

I would prefer #Schauhin,1 because it contains a request to open your eyes. I find that great! +1 even better if we had something with activity #TuWasDagegen [editorial note: do something about it] is quite long Schauhin is concise, short and not a direct attack but pointing out. great! +1 oh well, I also think Schauhin is great! active! challenging! and it makes the problem so clear, because people always just close their eyes when it comes to everyday racism. and “just open your eyes” is something I often use in the context of racism/sexism! Yes, SchauHin is actually not that bad. I’m torn between #Auf180 and #SchauHin. #Auf180 would mean anger and means that you don’t want to accept it. A little resistance. A little more aggressive.

#SchauHin I like even better.

- abblocken. inspired by the event “abbloggen,” because the aufschrei hashtag [editorial note: #outcry, referring to sexism] doesn’t mean that it’s about sexism and was quite clear. +1

- Rausschrei – pro: Strong +contra: Too close to Aufschrei/Another thought: The combination of the R of racism + Aufschrei) is too close to the “raus” (out) in “Ausländer raus” (foreigners out), right, I did not consciously realize that. scratch scratch

- Maybe search for Reinschrei completely independent of aufschrei? Otherwise the trolls will come immediately and it will be the same discussion as with other words, wouldn’t it be? Trolls will come anyway, but the connection to aufschrei is not obvious to me, does not have to be here, definitely attracts them faster … my concern is that the hashtag dies right at the beginning (it doesn’t last long enough because of aufschrei. sorry)s

- Diversity perhaps? As a challenge to the understanding of integration as assimilation?

As the above exchange illustrates, there were lively debates about the pros and cons of different terms and their potential to be adopted by others to generate visibility for the campaign. After the discussion, the organizers decided to conduct a vote on the hashtag names proposed. Users were instructed to vote by typing a “+1” after the suggested hashtag that they liked most. The proposed hashtag #SchauHin received the most votes and was therefore selected as the name for the campaign. An excerpt from September 6, 2013, shows the call for votes, and the report of the final results:

Dear all,

Collect hashtag suggestions for everyday racism here, evaluate, and decide quickly:)

If “scratch” is written THREE TIMES after a word, then we drop it.

I’ll copy favorites to the top, less discussed ones to the bottom.

Deadline: 3.45 PM (German time). Otherwise things will get out of hand:) Soo, we have enough suggestions now. I’ll list the top suggestions (you are welcome to help me) and with a +1 you can mark your agreement (no comments, the comments can be inserted below):

The voting ends at 3.55 (4 PM is tooo late):

- Abblocken +1+1

- Rausschrei +1

- Auf180 +1+1+1+1+1 +1+1+1+1+1

- AllRass +1+1+1+1

- SchauHin +1+1+1+1+1+1+1+1+1+1+1

- Rassismus247+1

- Tagesrassismus+1

Obscuring Content and Facilitating Discursive Closure

When the discussion on Titanpad moved to the subject of when the hashtag should be launched, trolls gained access and began hindering coordination by adding off-topic content as well as nonsensical, derogatory or antagonistic comments (spam). The following is one example of such trolling content – which involved making racist, antisemitic and sexist remarks:

penis hahan: DDDDDDDDDD

hello where isd the acction against natzis? xDD: DDDDDDDDDDDDD

t. Spurdo Spöhnke

snibeda snab: DDDDDDDD9gag army was here

  • :DDDD

  • fug: D:D:D:D:D

  • What is this about?


  • Everyday racism is nicecreated by Jews. You have to know!

  • I have enough books here.

  • +My name is [name], I was always waiting for Krautchannel, PENIS VAGINAL-STEEP LOL. I am 13 and would like to have intercourse with Overageguys (HOOKERS KIDS KNOWN NOTHING OF MY SEXUAL COMPLIMENASd

  • Hail Lucke!

  • NAZIS here!

  • SAW

  • SAW

  • SAW

  • SAW

  • SAW

  • SAW

  • SAW

  • SAWd

These trolls were counter-actors, in that they participated by creating confusion and diverting attention to drown out or silence voices. Such destructive activities were afforded by the open editing function of the Titanpad platform. The organizers of the campaign tried to manage trolls by deleting their content, warning users, and refocusing the discussion. Fig. 1 provides a screenshot of such efforts, showing a highlighted section with the comment: “Nazi propaganda was deleted here” (added rounded rectangle 1). However, this was later followed by additional derogatory and insulting comments.

Fig. 1. Titanpad Screenshot (Rectangles Added).


The right side of Fig. 1 shows a chat in which organizers and supporters openly discussed how to manage trolls (added rounded rectangle 2). One user asked whether “Everyone can delete everything that OTHERS are writing?” (added rounded rectangle 3) and received an affirmative response – thus illustrating how Titanpad’s features for adding, editing, and deleting afforded discursive closure and silencing.

In light of the challenges of managing trolls and the difficulty of agreeing upon a launch date for the hashtag, some users suggested simply going ahead, as the timing was not that important. The organizers agreed and launched the hashtag on Twitter without waiting for the final results of the vote.

Feature 2: Hashtagging

The Twitter feature of hashtagging enables users to categorize content and conversations under a linguistic marker. This feature effected visibilization in very different ways. On the one hand, supporters appropriated it to increase the visibility of racist norms, beliefs, and practices – which could now be grouped and amalgamated under the hashtag #SchauHin. On the other hand, counter-actors misappropriated it to obscure visibility, in an attempt to redirect content and silence anti-racist discourse (i.e., discursive closure).

Generating Visibility and Discursive Openness

The hashtag #SchauHin was publicized on Twitter in early September 2013, along with a call for people to share their experiences of racism in their daily lives:

And the hashtag for (or rather against) everyday racism saw the light of day at 3.55 PM: #SchauHin (@User2, September 6, 2013)

The hashtag was immediately picked up, as users began to share their experiences of micro-racism in day-to-day encounters. Table 2 provides examples of some of the experiences that were shared in the tweets. Users tweeted about a variety of personal experiences – be they in the workplace, schools, or universities, or during encounters with strangers, government agencies, or real estate agents – making visible the systemic nature of these various acts. By providing an umbrella term and a way to bring together and amalgamate content, the hashtag opened the discourse and provided supporters an opportunity to amplify the visibility of daily racism.

Table 2. Selection of #SchauHin Tweets.

Work environment

- Job: I call and give my name. Sorry, job’s gone, they say. German friend calls, job’s still available, interview too. #SchauHin – (@User6) September 6, 2013
- #schauhin, if at a job interview the topic is honor killings and forced marriage and not your qualifications! – (@User7) September 6, 2013

Public agencies

- When the official at the asylum office (!) calls Afghan refugees “Taliban rabble.” #SchauHin – (@User8) September 6, 2013
- Girlfriend (Italian citizen born in GER) at a public agency: Employee speaks to her: CAN … YOU … UNDERSTAND … ME …? #SchauHin – (@User9) September 6, 2013

Police racial profiling

- The constant police checks at Munich Central Station, with no grounds for suspicion. Never German looking men. #SchauHin – (@User10) September 6, 2013
- “You could be an illegal,” a policeman said to me for the 10th time on the train. #racialprofiling #SchauHin – (@User8) September 6, 2013

Schools and universities

- A teacher told a classmate of Turkish descent who was chatting in class: “You are a guest in this country, so behave yourself.” #SchauHin – (@User11) September 6, 2013
- Winter. A friend wants to borrow my gloves for a short time. Teacher: “No, she needs them herself, it’s colder here than in Africa.” #SchauHin – (@User12) September 6, 2013

Public debate

- When the media features people saying that the racist murders are the fault of the migrants themselves. #SchauHin – (@User13) September 6, 2013
- When a friend on FB shares an NPD poster and defends herself by saying that she is against racists, but the “content” is right. #SchauHin – (@User14) September 6, 2013

Housing market

- A friend of mine didn’t get an appointment to view an apartment until he gave “Becker” (name of the girlfriend) on the phone. #SchauHin – (@User15) September 6, 2013
- When the landlord rejects an American because he would never be able to get along in her house as a “black.” #SchauHin – (@User16) September 6, 2013

Public setting

- 12-year-old me on my bike: “ring ring.” Pedestrian turns around and back again. And says loudly: For something like this I will not step aside. #SchauHin – (@User17) September 6, 2013
- Sentences that start with “I have nothing against you but …” #SchauHin – (@User18) September 6, 2013
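Mechanically, the amalgamation that the hashtag afforded can be sketched as a simple inverted index. The following is a minimal, entirely hypothetical illustration (not Twitter’s actual implementation): any tweet containing the marker is collated under the same key, regardless of who wrote it or why.

```python
import re

def collate_by_hashtag(tweets):
    """Group tweet texts under every hashtag they contain (case-insensitive)."""
    index = {}
    for text in tweets:
        for tag in re.findall(r"#\w+", text):
            index.setdefault(tag.lower(), []).append(text)
    return index

# A supportive tweet and a hijacking tweet land under the same key:
tweets = [
    "Job interview was about forced marriage, not my qualifications. #SchauHin",
    "I feel marginalized as an NPD voter. #schauhin",
]
index = collate_by_hashtag(tweets)
```

This single shared key is what makes the hashtag both an amplifier and a vulnerability: supporters and counter-actors write into the very same collection.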

Obscuring Content and Facilitating Discursive Closure

Similar to what happened on Titanpad, counter-actors engaged in disruptive efforts to hinder the campaign and its goal of drawing attention to systemic racism. Counter-actors misappropriated the hashtag, using it in association with racist tweets and content. In the organizers’ internal documents and in media reports, these were referred to as attempts to “hijack the hashtag” (e.g., Meissner, 2014). For instance, in a news article about far-right extremism and social media, Nasman (2015) notes: “Well, it seems far-right groups have begun hijacking hashtags and overwhelming the discussion with far-right views. Take the anti-racism hashtag #schauhin, for example.”

Oftentimes, the subversive nature of counter-actors’ tweets was not immediately obvious. For example, they were often ambiguous or phrased in a similar style as the tweets from #SchauHin supporters – pointing out, for example, seemingly negative experiences, personal restrictions, and changes that the tweets’ authors opposed:

“I’m not allowed to see the hair of the headscarf girls. #schauhin”;

“I can’t get my kebab with pork. #schauhin”;

“Haribo is now also available in Halal! #schauhin”;

“I feel marginalized as an NPD voter. #schauhin.”2

By tweeting content that was irrelevant, belittling, and antagonistic to the overarching purpose of the campaign, counter-actors distracted and diverted attention away from the “relevant” and focal content of the campaign. In this way, trolls and their counter-efforts sought to obscure and thus close down anti-racist discourse.

Feature 3: Creating a Profile

Generating Visibility and Discursive Openness

When Twitter users create a profile, they create a kind of business card, brand, or biography of who they are and what is important to them. The organizers of the #SchauHin campaign created a profile for the movement to explain what the campaign was about, what its goals were, and how people could get involved and engaged. Using the same Twitter profile name and handle as the hashtag #SchauHin, organizers sought to create a “go-to” profile page to further increase visibility and recognition for the campaign. Fig. 2 shows the Twitter profile picture, which prominently features the hashtag.

Fig. 2. Twitter Profile Picture.


The profile page was used to tweet, retweet, like, and respond to other tweets with the hashtag #SchauHin – thereby generating and amplifying visibility for the campaign. As people began following the new profile page to stay informed, the profile page provided a way to focus attention and amalgamate a wider range of content relevant to the goal of ending systemic racism. It also provided a link to the #SchauHin website. In this way, the profile page contributed to opening discourse about daily racism.

Obscuring Content and Facilitating Discursive Closure

Counter-actors tried to disrupt the campaign through the creation of profile pages that were similar in name and visual design. One profile, for instance, just added a “2” to the end of the account handle, calling itself @SchauHin2. It used the same logo and a similar color palette as @SchauHin. Fig. 3 provides a screenshot of the profile page, where the text in the added rounded rectangle reads: “Join in: Use the hashtag #SchauHin for all national tweets against a foreign takeover. Let’s create solidarity and unity!”

Importantly, while a Twitter handle must be unique, a Twitter name does not have to be. So, while the account’s handle is @SchauHin2, its owners call themselves SchauHin. Again, there is a conflation: the two profiles advocate effectively opposite views while bearing the same name and looking very similar. Hence, counter-actors used the feature of creating a profile to divert attention away from the original #SchauHin campaign and obscure it. As Fig. 3 shows, the SchauHin2 Twitter profile gathered a fair amount of attention and involvement – with over 150 followers, more than 1,200 (re-)tweets, and over 6,000 likes.
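The impersonation exploits a simple namespace rule, sketched below as a toy data model (hypothetical, not Twitter’s): handles are unique keys, while display names are unconstrained, so two accounts can present near-identically.

```python
class AccountRegistry:
    """Toy namespace: handles are unique keys; display names are unconstrained."""

    def __init__(self):
        self.accounts = {}  # handle -> display name

    def create(self, handle, display_name):
        if handle in self.accounts:
            raise ValueError(f"handle {handle!r} is already taken")
        self.accounts[handle] = display_name  # display names may repeat freely

reg = AccountRegistry()
reg.create("@SchauHin", "SchauHin")
reg.create("@SchauHin2", "SchauHin")  # duplicate display name is permitted
```

Only the uniqueness constraint on the handle distinguishes the two accounts; everything a casual reader sees (name, logo, colors) can be copied.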

Fig. 3. SchauHin2 Twitter Profile Screenshot (Rectangle Added).


These counter-profiles and tweets made the #SchauHin organizers aware of the need to clearly communicate the meaning of the hashtag and reinforce the goal of the campaign. In an interview, one of the organizers of the #SchauHin campaign explained how the emergence of these “fake” profiles highlighted the significance of the campaign:

The Twitter accounts existed very early on at the beginning of the hashtag (…) And as I said, the racist tweets underscore the point of #SchauHin. How else can this ugly face of our society be demonstrated so clearly? And the zeal of the racists says a lot about these people: They want to prevent a debate on racism at all costs and focus on their own agenda. I think these desperate attempts only show the relevance of this debate. So: No, the campaign has not been subverted and it is not a turning point – these tweets are nothing new. The point of #SchauHin is well known. These tweets only make this debate more important. (Initiator of #SchauHin in an interview with Focus Online (Rohler, 2014))

Feature 4: Trending

Trending is an automated Twitter feature supported by underlying algorithms that draw attention to topics deemed “hot” or that are generating “buzz” within a certain time frame. It is determined by a combination of three criteria: popularity, novelty, and timeliness. By automatically identifying and flagging trending hashtags, Twitter foregrounds these hashtags and increases their visibility – while indirectly backgrounding others. As occurred in the case of hashtags and profile pages, the trending feature impacted visibilization in very different ways, leading to both discursive openness and closure.

Generating Visibility and Discursive Openness

When the organizers launched the campaign, they encouraged supporters to start tweeting under the hashtag – as a way to generate a large number of tweets in a short period of time. #SchauHin became a trending topic in Germany on the day of its initiation and remained on the list for three days. Fig. 4 shows a screenshot of the Twitter trends for Germany. In other words, the Twitter algorithm identified it as one of the most used and discussed hashtags on Twitter in Germany.

Fig. 4. #SchauHin as Trending Topic on Twitter.


As a trending topic, #SchauHin attracted the attention of several print media outlets in Germany, such as Süddeutsche Zeitung, Tagesspiegel, and Stern. Several articles noted the quantity of tweets in a very short time frame and used this to deduce the campaign’s significance (e.g., Adeoso, 2013). According to some, the trending of the hashtag provided clear indications of the existence of systemic racism. Visibility was therefore extended to audiences outside Twitter. As one user noted: “How can you say that there is no #racism in Germany: the hashtag #schauhin has only existed for 8 hours and already it is the second most frequent tweet” (@User19, September 6, 2013). The trending topic feature on Twitter thus helped amplify the visibility of the campaign and its goal of contributing to ending systemic racism.

Obscuring Content and Facilitating Discursive Closure

Counter-actors’ efforts to obscure the campaign by appropriating the #SchauHin hashtag had the unintended effect of adding to the overall number of tweets that “fed into” Twitter’s trending algorithm. In other words, fake profiles and the content generated by trolls contributed (albeit largely unintentionally) to enhancing the visibility of the #SchauHin campaign. As noted above (see Fig. 3), the @SchauHin2 profile tweeted or retweeted over 1,200 times and liked tweets over 6,000 times in the first year. Thus, on one level, counter-actors’ disruptive efforts generated discursive closure (i.e., they diverted attention, created confusion, and drowned out anti-racist discourse). Yet, on another level, they unintentionally amplified visibility, because the attempt to “hijack” the original #SchauHin hashtag paradoxically contributed to making it a trending topic on Twitter in Germany.
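This paradoxical amplification can be illustrated with a toy scoring model (entirely hypothetical; Twitter’s actual trending algorithm is proprietary): if a trending score simply aggregates recent tweets per hashtag, weighting newer tweets more heavily, then hijacking tweets raise the score exactly as supportive tweets do.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Tweet:
    hashtag: str
    hours_ago: float
    supportive: bool  # plays no role in the score below

def trending_scores(tweets, window_hours=8.0):
    """Toy trending score: recent tweets per hashtag, with newer tweets
    weighted more heavily (echoing popularity, novelty, timeliness)."""
    scores = Counter()
    for t in tweets:
        if t.hours_ago <= window_hours:
            scores[t.hashtag] += 1.0 - t.hours_ago / window_hours
    return scores

# 50 supportive tweets plus 30 hijacking tweets: both raise the score.
tweets = [Tweet("#SchauHin", 1.0, True) for _ in range(50)]
tweets += [Tweet("#SchauHin", 2.0, False) for _ in range(30)]
scores = trending_scores(tweets)
```

Because the score is blind to intent, the troll tweets in this sketch strictly increase #SchauHin’s visibility relative to the supportive tweets alone.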

Summary: The Struggle for Visibility

Both Titanpad and Twitter were used to plan and execute the #SchauHin campaign. On both platforms, counter-actors who opposed the goal and efforts of drawing attention to everyday acts of racism tried to disrupt #SchauHin. On Titanpad, counter-actors and trolls were very effective in hindering coordination and planning. As soon as they gained access to the open platform, they were free to delete and edit relevant content, as well as add irrelevant, derogatory, or antagonistic content. Moreover, they could do this while remaining fairly anonymous. Such counter-efforts took on a different form on Twitter because of its different features. While counter-actors and trolls were also free to add content, Twitter’s features did not allow them to delete or edit content other than their own. In addition, Twitter allows for a clear attribution of content to specific accounts or Twitter handles. Despite this attribution, however, counter-actors identified creative ways to mask it – such as, in our case, creating profiles that mirrored the actual #SchauHin account or posting tweets that mimicked aspects of the campaign’s content and styles of argumentation. As our case showed, the struggle between organizers/supporters and counter-actors played out quite differently on Titanpad and Twitter. These differences were, in large part, due to variations in the affordances provided by each platform.

In the following months, the discursive struggle between supporters and counter-actors continued. As counter-actors ramped up their efforts to close and silence anti-racist discourse, #SchauHin organizers planned and then executed a campaign to “reclaim the hashtag” – encouraging Twitter users to again tweet more about their experiences of daily racism (cf. Fig. 5). Thus, the struggle for visibility continued.

Fig. 5. #SchauHin Profile.



Social media and other digital platforms have fundamentally transformed ways of connecting, collaborating, and mobilizing (Dobusch & Schoeneborn, 2015; Etter & Albu, 2020; Vaast & Kaganer, 2013). They have become sites of interaction, debate, and conflict – allowing individual voices to be aggregated in ways that can simultaneously generate discursive openness and closure. In this article, we sought to understand how digital technology – and platforms in particular – can enable and hinder institutional processes through what we refer to as visibilization, i.e., the enactment of technological features to foreground particular perspectives, positions, and discourses and give voice to them, while silencing or subordinating others.

By applying an affordance-based lens to the #SchauHin movement, we were able to identify how the contextual use of platform features by human actors impacted the visibility of discourses that sought to draw attention to and counter systemic racism. By examining the struggle between supporters and counter-actors, we highlighted the different implications of the affordance of visibility. We showed how a particular technological feature can be interpreted and used in radically different ways to make content more or less visible. The tweeting feature on Twitter, for example, can be assigned very different meanings and opportunities for action. In our case, supporters of #SchauHin used this feature to generate visibility and discursive openness, whereas counter-actors used the same feature to obscure visibility and fuel discursive closure (Deetz, 1992; Leonardi & Jackson, 2004).

Our empirical case was one of discursive struggle between groups of actors with opposing interests and agendas. One group sought to use the digital platforms to mobilize action against everyday acts of racism by making deeply ingrained practices and behaviors visible. Their efforts, however, were met with resistance from counter-actors, who used the same platforms to divert and hinder mobilization – e.g., by creating content that was off-topic, antagonistic, derogatory, or confusing. By analyzing how actors differentially enacted a variety of technological features, we captured the nuanced ways in which platforms can be strategically used to formulate, disseminate, or obscure content – making visible or invisible the meanings, practices, and structures that underpin institutional arrangements.

Our study contributes to research at the intersection of technology and institutional theory in two ways. First, we contribute to understandings of how technological affordances influence discursive struggles. Concretely, we showed how digital platforms have opened up opportunities and ways for a wide range of actors to gain voice – notably, through enabling an aggregation of individual voices that might otherwise have been marginalized or silenced by more powerful actors. Second, we further our understandings of how technology can enable or hinder institutional processes through the process of visibilization.

Platform Features, Visibility, and Institutional Processes

We contribute to a relational understanding of technological affordances (see also King et al., 2022) – particularly focusing on the affordance of visibility and its implications for institutional processes. As digital platforms have become “essential infrastructures” for collaborating and organizing (Bohn, Friederici, & Gümüsay, 2020; see also Friederici, Meier, & Gümüsay, 2020; Logue & Grimes, 2021), they are important in the toolkit of institutional entrepreneurs and those who seek to mobilize power or resources to shape institutions (Maguire, Hardy, & Lawrence, 2004). Using the #SchauHin case, we illustrate how platform users – by strategically selecting what they showed and how they showed it – were able to instrumentally influence mobilization and the aggregation of voice (Clemente & Roulet, 2015; Etter & Vestergaard, 2015).

Different features of platforms can significantly impact institutional arrangements. In this way, insights from our study speak to recent calls to examine and theorize the interplay between technology and (de-)institutionalization (Logue & Grimes, 2021; van Rijmenam & Logue, 2020). Our findings illustrate how this interplay is shaped not only by technological affordances but also by unintended consequences rooted in platform features. In our case, counter-actors sought to impede the #SchauHin campaign and its goal of deinstitutionalizing racism. The technological feature of hashtagging affords actors with contradicting interests the opportunity to foreground their own interpretation of a particular issue of contestation (e.g., daily racism) while suppressing that of others. Our analysis has shown how this unfolds as a complex and dynamic socio-material process, whereby the visibilization and obscuring of content contributes to (de-)institutionalizing processes. In our case, however, counter-activists’ efforts to enact technological features to divert attention and sow confusion partially had the opposite effect. Specifically, their addition of content (despite being racist, antisemitic, sexist, nonsensical, etc.) paradoxically increased the visibility of the campaign and supported its efforts toward deinstitutionalization by contributing to the overall number of tweets with the #SchauHin hashtag. This phenomenon resonates with research at the intersection of institutional theory and paradox studies (Gümüsay, Smets, & Morris, 2020; Smith & Tracey, 2016), showing how technological affordances can generate paradoxical outcomes depending on actors’ attempts to effect institutions. Relatedly, a socio-technical paradox may be explored at the intersection of the open culture sought by activists and the discursive closure sought by destructive trolls. Certain platform features encourage open organizing (Dobusch & Schoeneborn, 2015), and that very openness is what attracts the practices that hijack it.

Digital platform features offer actors a wide variety of opportunities that may have important implications for maintaining or disrupting institutional arrangements (Logue & Grimes, 2021). For example, hashtagging on Twitter provides a way to categorize and amalgamate content in ways that amplify visibility and voice. It enables individuals or marginalized actors to raise awareness of endemic problems and collectively mobilize against highly institutionalized practices and beliefs. However, hashtags may be vulnerable to being “hijacked” by counter-actors who seek to disrupt mobilization efforts (Albu & Etter, 2016) and maintain institutional arrangements. An understanding that technological features are not objective things-in-themselves but rather “things for us to use” may enable scholars to appreciate how potential struggles play out through the way technological features are enacted. A natural corollary of this is that platforms cannot be examined in isolation from the way that actors mobilize them.

Thus, an important implication is that social media and other digital platforms are reshaping power dynamics in significant ways – not only by giving voice to peripheral actors but also by making the practices and actions of powerful actors subject to widespread scrutiny (Etter & Vestergaard, 2015; Gillespie, 2010, 2018). Power and institutions have been a central and recurring theme in institutional research (Lawrence & Buchanan, 2017), and we can expect social media to play an important role in altering power relationships between individuals, groups, and organizations (Etter et al., 2019). Future research could further unpack how digital platforms might affect institutional processes differently compared to mobilization and coordination efforts that take place physically or face to face – i.e., where connectivity and interaction are more limited by temporal and spatial constraints. In addition, future research could examine further the entanglement of digital and analog domains around institutions (Gümüsay & Smets, 2020).

Visibility Struggles and Technological Affordances

Our case showed how the interplay between material features and contextual use by supporters and counter-actors influenced the visibility of content in ways that generated both discursive openness and closure. On the Titanpad platform, for example, actors could add and delete specific content to open up or close down discourse by aligning or misaligning it with a particular perspective, position, or stance. On Twitter, supporters of #SchauHin generated and amalgamated content to fuel discursive openness, while counter-actors generated opposing or confusing content to generate discursive closure. In this way, platforms constitute social spaces where actors might engage in a struggle around meaning-making (Albu & Etter, 2016; Etter & Albu, 2020).

Yet, at the same time, platforms themselves have agency with regard to what they make visible or invisible (Leonardi, 2012). Although users actively appropriate and adapt platform technologies for their particular interests and agendas, the properties and architectures of these platforms also shape content and usage (Costa, 2018). They may even do so in ways that implicitly support practices like trolling and harassment (Massanari, 2017). Twitter and Facebook, for example, have been criticized for fueling ideological polarization (Dylko et al., 2017), disinformation (Tucker et al., 2018), and filter bubbles or echo chambers (Pariser, 2011) that decrease the likelihood of encountering ideologically cross-cutting content. In response to such criticisms, there have been attempts to alter certain aspects or features of digital platforms. An illustrative example would be Twitter’s move to flag tweets with warnings and public interest notices – most notably, flagging several of former US President Donald Trump’s tweets for “glorifying violence.” By flagging a tweet, Twitter requires users to take an extra step of clicking a “view” button to access the tweet. Moreover, users are restricted from directly retweeting or “liking” the tweet. This move by Twitter has generated intense debate and mixed responses especially in Silicon Valley. Whereas some in the tech industry praised this feature, others cautioned that such interventions move digital platforms into the sphere of political activism and influence. Thus, despite some platforms’ claims of being apolitical, they are rarely ever neutral (Costa, 2018; Gümüsay & Reinecke, 2022) – as their features are often based on opaque algorithmic systems of content moderation and user governance designed to orchestrate relationships in favor of advertisers or competent manipulators (Gillespie, 2010, 2018).


Digital platforms provide infrastructures for expansive and immediate connectivity. They have become arenas of interaction that facilitate, regulate, and shape communication between ever-shifting coalitions that form and dissolve around each issue (van Dijck, 2013). Using the case of #SchauHin, we have shown how technology – and digital platforms in particular – can influence discursive struggles and contestation around highly institutionalized practices, beliefs, and behaviors. Our study thus joins the call for a better appreciation of how digital technology interacts with institutions and how it can fundamentally transform ways of mobilizing to effect change (Hinings et al., 2018; see also Berente & Seidel, 2022; Gurses, Yakis-Douglas, & Özcan, 2022; Jarvis, Eden, Wright, & Burton-Jones, 2022; Schildt, 2022). It does so by underscoring the importance of generating visibility – and the struggle around it – and by disentangling how actors’ contextual intentions and use of digital technological features enable or undermine processes of visibilization.



SchauHin translates literally to “look there” or more colloquially to “open your eyes.”


The NPD is an extreme far-right party in Germany.


Adeoso, M.-S. (2013). #SchauHin ist der neue #Aufschrei [#SchauHin is the new #Aufschrei]. Retrieved from

Albu, O. B. (2019). Dis/ordering: The use of information and communication technologies by human rights civil society organizations. In C. Vásquez & T. Kuhn (Eds.), Dis/organization as communication: Exploring the disordering, disruptive and chaotic properties of communication (pp. 151–171). London: Routledge.

Albu, O. B., & Etter, M. (2016). Hypertextuality and social media: A study of the constitutive and paradoxical implications of organizational Twitter use. Management Communication Quarterly, 30(1), 5–31.

Altheide, D. L. (2013). Media logic, social control, and fear. Communication Theory, 23(3), 223–238.

Barberá-Tomás, D., Castelló, I., de Bakker, F. G. A., & Zietsma, C. (2019). Energizing through visuals: How social entrepreneurs use emotion-symbolic work for social change. Academy of Management Journal, 62(6), 1789–1817.

Bennett, W. L., & Segerberg, A. (2012). The logic of connective action. Information, Communication & Society, 15(5), 739–768.

Bohn, S., Friederici, N., & Gümüsay, A. A. (2020). Too big to fail us? Platforms as systemically relevant. Internet Policy Review. Retrieved from

Boxenbaum, E., Jones, C., Meyer, R. E., & Svejenova, S. (2018). Towards an articulation of the material and visual turn in organization studies. Organization Studies, 39(5–6), 597–616.

Breuer, A., Landman, T., & Farquhar, D. (2015). Social media and protest mobilization: Evidence from the Tunisian revolution. Democratization, 22(4), 764–792.

Brighenti, A. (2007). Visibility: A category for the social sciences. Current Sociology, 55(3), 323–342.

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.

Castelló, I., Etter, M., & Nielsen, F. Å. (2016). Strategies of legitimacy through social media: The networked strategy. Journal of Management Studies, 53(3), 402–432.

Castells, M. (1998). The rise of the network society (Reprinted). Oxford: Blackwell.

Chandler, D., & Hwang, H. (2015). Learning from learning theory: A model of organizational adoption strategies at the microfoundations of institutional theory. Journal of Management, 41(5), 1446–1476.

Chen, X., & Wei, S. (2019). Enterprise social media use and overload: A curvilinear relationship. Journal of Information Technology, 34(1), 22–38.

Clemente, M., & Roulet, T. J. (2015). Public opinion as a source of deinstitutionalization: A “spiral of silence” approach. Academy of Management Review, 40(1), 96–114.

Cook, C., Schaafsma, J., & Antheunis, M. (2018). Under the bridge: An in-depth examination of online trolling in the gaming context. New Media & Society, 20(9), 3323–3340.

Coretti, L., & Pica, D. (2018). Facebook’s communication protocols, algorithmic filters, and protest. In M. Mortensen, C. Neumayer, & T. Poell (Eds.), Social media materialities and protest: Critical reflections (pp. 72–85). London: Routledge.

Cornelissen, J. P., Durand, R., Fiss, P. C., Lammers, J. C., & Vaara, E. (2015). Putting communication front and center in institutional theory and analysis. Academy of Management Review, 40(1), 10–27.

Costa, E. (2018). Affordances-in-practice: An ethnographic critique of social media logic and context collapse. New Media & Society, 20(10), 3641–3656.

Dacin, M. T., & Dacin, P. A. (2008). Traditions as institutionalized practice: Implications for deinstitutionalization. In R. Greenwood, C. Oliver, R. Suddaby, & K. Sahlin-Andersson (Eds.), The SAGE handbook of organizational institutionalism (pp. 327–351). London: SAGE.

Daudigeos, T., Roulet, T., & Valiorgue, B. (2020). How scandals act as catalysts of fringe stakeholders’ contentious actions against multinational corporations. Business & Society, 59(3), 387–418.

Deephouse, D. L., & Carter, S. M. (2005). An examination of differences between organizational legitimacy and organizational reputation. Journal of Management Studies, 42(2), 329–360.

Deetz, S. A. (1992). Democracy in an age of corporate colonization: Developments in communication and the politics of everyday life. Albany, NY: State University of New York Press.

den Hond, F., & de Bakker, F. G. A. (2007). Ideologically motivated activism: How activist groups influence corporate social change activities. Academy of Management Review, 32(3), 901–924.

Dobusch, L., & Schoeneborn, D. (2015). Fluidity, identity, and organizationality: The communicative constitution of Anonymous. Journal of Management Studies, 52(8), 1005–1035.

Dylko, I., Dolgov, I., Hoffman, W., Eckhart, N., Molina, M., & Aaziz, O. (2017). The dark side of technology: An experimental investigation of the influence of customizability technology on online political selective exposure. Computers in Human Behavior, 73, 181–190.

Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532–550.

Etter, M., & Albu, O. (2020). Activists in the dark: Social media algorithms and collective action. Organization, 28, 68–91.

Etter, M., Colleoni, E., Illia, L., Meggiorin, K., & D’Eugenio, A. (2018). Measuring organizational legitimacy in social media: Assessing citizens’ judgments with sentiment analysis. Business & Society, 57(1), 60–97.

Etter, M., Ravasi, D., & Colleoni, E. (2019). Social media and the formation of organizational reputation. Academy of Management Review, 44(1), 28–52.

Etter, M., & Vestergaard, A. (2015). Facebook and the public framing of a corporate crisis. Corporate Communications: An International Journal, 20(2), 163–177.

Faraj, S., & Azad, B. (2012). The materiality of technology: An affordance perspective. In P. M. Leonardi, B. A. Nardi, & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a technological world (pp. 237–258). Oxford: Oxford University Press.

Flyvbjerg, B. (2006). Five misunderstandings about case-study research. Qualitative Inquiry, 12(2), 219–245.

Flyverbom, M., Leonardi, P., Stohl, C., & Stohl, M. (2016). The management of visibilities in the digital age – Introduction. International Journal of Communication, 10, 98–109.

Friederici, N., Meier, P., & Gümüsay, A. A. (2020). An opportunity for inclusion? Digital platform innovation in times of crisis. Pioneers Post. Retrieved from

Gantt Shafer, J. (2017). Donald Trump’s “political incorrectness”: Neoliberalism as frontstage racism on social media. Social Media + Society, 3(3), 2056305117733226.

Gegenhuber, T., & Naderer, S. (2019). When the petting zoo spawns into monsters: Open dialogue and a venture’s legitimacy quest in crowdfunding. Innovation, 21(1), 151–186.

Gillespie, T. (2010). The politics of ‘platforms.’ New Media & Society, 12(3), 347–364.

Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge, MA: The MIT Press.

Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. London: Yale University Press.

Glaser, V. L., Pollock, N., & D’Adderio, L. (2021). The biography of an algorithm: Performing algorithmic technologies in organizations. Organization Theory, 2(2), 26317877211004610.

Gray, B., Purdy, J. M., & Ansari, S. (2015). From interactions to institutions: Microprocesses of framing and mechanisms for the structuring of institutional fields. Academy of Management Review, 40(1), 115–143.

Gümüsay, A. A., & Reinecke, J. (2022). Researching for desirable futures: From real utopias to imagining alternatives. Journal of Management Studies, 59(1), 236–242.

Gümüsay, A. A., & Smets, M. (2020). New hybrid forms and their liability of novelty. In M. L. Besharov & B. C. Mitzinneck (Eds.), Organizational hybridity: Perspectives, processes, promises – Research in the sociology of organizations (Vol. 69, pp. 167–187). Bingley: Emerald Publishing Limited.

Gümüsay, A. A., Smets, M., & Morris, T. (2020). “God at work”: Engaging central and incompatible institutional logics through elastic hybridity. Academy of Management Journal, 63(1), 124–154.

Hansen, H. K., & Flyverbom, M. (2015). The politics of transparency and the calibration of knowledge in the digital age. Organization, 22(6), 872–889.

Harmon, D. J. (2019). When the Fed speaks: Arguments, emotions, and the microfoundations of institutions. Administrative Science Quarterly, 64(3), 542–575.

Heavey, C., Simsek, Z., Kyprianou, C., & Risius, M. (2020). How do strategic leaders engage with social media? A theoretical framework for research and practice. Strategic Management Journal, 41(8), 1490–1527.

Hinings, B., Gegenhuber, T., & Greenwood, R. (2018). Digital innovation and transformation: An institutional perspective. Information and Organization, 28(1), 52–61.

Hoffman, A. J. (1999). Institutional evolution and change: Environmentalism and the U.S. chemical industry. Academy of Management Journal, 42(4), 351–371.

Hudson, B. A., Okhuysen, G. A., & Creed, W. E. D. (2015). Power and institutions: Stones in the road and some yellow bricks. Journal of Management Inquiry, 24(3), 233–238.

Jones, C., Boxenbaum, E., & Anthony, C. (2013). The immateriality of material practices in institutional logics. In M. D. Lounsbury & E. Boxenbaum (Eds.), Institutional logics in action, Part A (Vol. 39, pp. 51–75). Bingley: Emerald Publishing Limited.

Latour, B. (1996). On actor-network theory: A few clarifications. Soziale Welt, 47(4), 369–381.

Lawrence, T. B., & Buchanan, S. (2017). Power, institutions and organizations. In R. Greenwood, C. Oliver, T. B. Lawrence, & R. E. Meyer (Eds.), The SAGE handbook of organizational institutionalism (pp. 477–506). London: SAGE.

Leonardi, P. M. (2011). When flexible routines meet flexible technologies: Affordance, constraint, and the imbrication of human and material agencies. MIS Quarterly, 35(1), 147–167.

Leonardi, P. M. (2012). Materiality, sociomateriality, and socio-technical systems: What do these terms mean? How are they related? Do we need them? In P. M. Leonardi, B. A. Nardi, & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a technological world (pp. 25–48). Oxford: Oxford University Press.

Leonardi, P. M. (2014). Social media, knowledge sharing, and innovation: Toward a theory of communication visibility. Information Systems Research, 25(4), 796–816.

Leonardi, P. M., Huysman, M., & Steinfield, C. (2013). Enterprise social media: Definition, history, and prospects for the study of social technologies in organizations. Journal of Computer-Mediated Communication, 19(1), 1–19.

Leonardi, P. M., & Jackson, M. H. (2004). Technological determinism and discursive closure in organizational mergers. Journal of Organizational Change Management, 17(6), 615–631.

Leonardi, P. M., & Vaast, E. (2017). Social media and their affordances for organizing: A review and agenda for research. Academy of Management Annals, 11(1), 150–188.

Logue, D., & Grimes, M. (2021). Platforms for the people: Enabling civic crowdfunding through the cultivation of institutional infrastructure. Strategic Management Journal, 43(3), 663–693.

Madsen, V. T. (2016). Constructing organizational identity on internal social media: A case study of coworker communication in Jyske Bank. International Journal of Business Communication, 53(2), 200–223.

Maguire, S., & Hardy, C. (2009). Discourse and deinstitutionalization: The decline of DDT. Academy of Management Journal, 52(1), 148–178.

Maguire, S., Hardy, C., & Lawrence, T. B. (2004). Institutional entrepreneurship in emerging fields: HIV/AIDS treatment advocacy in Canada. Academy of Management Journal, 47(5), 657–679.

Mair, J., & Martí, I. (2009). Entrepreneurship in and around institutional voids: A case study from Bangladesh. Journal of Business Venturing, 24(5), 419–435.

Massanari, A. (2017). #Gamergate and the Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Matamoros-Fernández, A. (2017). Platformed racism: The mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube. Information, Communication & Society, 20(6), 930–946.

Meissner, M. (2014). #SchauHin – Gegen Rassisten und Hashtag-Räuber [#SchauHin – against racists and hashtag robbers]. Retrieved from

Meyer, R. E., Jancsary, D., Höllerer, M. A., & Boxenbaum, E. (2018). The role of verbal and visual text in the process of institutionalization. Academy of Management Review, 43(3), 392–418.

Milan, S. (2015). When algorithms shape collective action: Social media and the dynamics of cloud protesting. Social Media + Society, 1(2), 1–10.

Monteiro, P., & Nicolini, D. (2015). Recovering materiality in institutional work: Prizes as an assemblage of human and material entities. Journal of Management Inquiry, 24(1), 61–81.

Nasman, C. (2015). Far-right extremists in Germany turn to social media to spread their ideas. Deutsche Welle. Retrieved from

Neumayer, C., & Rossi, L. (2016). 15 years of protest and media technologies scholarship: A sociotechnical timeline. Social Media + Society, 2(3), 2056305116662180.

Ouellette, L., & Banet-Weiser, S. (2018). Special issue: Media and the extreme right – Editor’s introduction. Communication, Culture and Critique, 11(1), 1–6.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York, NY: Penguin Press.

Phillips, N., & Oswick, C. (2012). Organizational discourse: Domains, debates, and directions. Academy of Management Annals, 6(1), 435–481.

Poell, T., & van Dijck, J. (2015). Social media and activist communication. In C. Atton (Ed.), The Routledge companion to alternative and community media (pp. 527–537). London: Routledge. Retrieved from

Ramsden, P. (2020). How the pandemic changed social media and George Floyd’s death created a collective conscience. The Conversation. Retrieved from

Raynard, M., Kodeih, F., & Greenwood, R. (2020). Proudly elitist and undemocratic? The distributed maintenance of contested practices. Organization Studies, 42(1), 7–33.

Rieder, B. (2012). What is in PageRank? A historical and conceptual investigation of a recursive status index. Computational Culture, 2.

Rodner, V., Roulet, T. J., Kerrigan, F., & vom Lehn, D. (2020). Making space for art: A spatial perspective of disruptive and defensive institutional work in Venezuela’s art world. Academy of Management Journal, 63(4), 1054–1081.

Roulet, T. J. (2020). The power of being divisive: Understanding negative social evaluations (Illustrated ed.). Redwood City, CA: Stanford Business Books.

Scheidgen, K., Gümüsay, A. A., Günzel-Jensen, F., Krlev, G., & Wolf, M. (2021). Crises and entrepreneurial opportunities: Digital social innovation in response to physical distancing. Journal of Business Venturing Insights, 15, e00222.

Schneiberg, M., & Lounsbury, M. (2017). Social movements and the dynamics of institutions and organizations. In R. Greenwood, C. Oliver, T. B. Lawrence, & R. E. Meyer (Eds.), The SAGE handbook of organizational institutionalism (2nd ed., pp. 281–310). London: SAGE.

Seidel, V. P., Hannigan, T. R., & Phillips, N. (2020). Rumor communities, social media, and forthcoming innovations: The shaping of technological frames in product market evolution. Academy of Management Review, 45(2), 304–324.

Seo, M.-G., & Creed, W. E. D. (2002). Institutional contradictions, praxis, and institutional change: A dialectical perspective. Academy of Management Review, 27(2), 222–247.

Smith, W. K., & Tracey, P. (2016). Institutional complexity and paradox theory: Complementarities of competing demands. Strategic Organization, 14(4), 455–466.

Thompson, J. B. (2005). The new visibility. Theory, Culture & Society, 22(6), 31–51.

Thornton, P. H., & Ocasio, W. (2008). Institutional logics. In R. Greenwood, C. Oliver, R. Suddaby, & K. Sahlin-Andersson (Eds.), The SAGE handbook of organizational institutionalism (pp. 99–129). London: SAGE.

Tolbert, P. S., & Zucker, L. G. (1999). The institutionalization of institutional theory. In S. Clegg & C. Hardy (Eds.), Studying organization: Theory & method (pp. 169–184). London: SAGE.

Toubiana, M., & Zietsma, C. (2017). The message is on the wall? Emotions, social media and the dynamics of institutional complexity. Academy of Management Journal, 60(3), 922–953.

Treem, J. W., & Leonardi, P. M. (2013). Social media use in organizations: Exploring the affordances of visibility, editability, persistence, and association. Annals of the International Communication Association, 36(1), 143–189.

Treem, J. W., Leonardi, P. M., & van den Hooff, B. (2020). Computer-mediated communication in the age of communication visibility. Journal of Computer-Mediated Communication, 25(1), 44–59.

Trittin-Ulbrich, H., Scherer, A. G., Munro, I., & Whelan, G. (2021). Exploring the dark and unexpected sides of digitalization: Toward a critical agenda. Organization, 28(1), 8–25.

Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. Retrieved from

Uldam, J., & Kaun, A. (2018). Theorizing civic engagement and social media. In M. Mortensen, C. Neumayer, & T. Poell (Eds.), Social media materialities and protest: Critical reflections (pp. 101–115). London: Routledge.

Vaast, E., & Kaganer, E. (2013). Social media affordances and governance in the workplace: An examination of organizational policies. Journal of Computer-Mediated Communication, 19(1), 78–101.

Van Dijck, J. (2013). ‘You have one identity’: Performing the self on Facebook and LinkedIn. Media, Culture & Society, 35(2), 199–215.

Van Dijck, J., & Poell, T. (2013). Understanding social media logic. Media and Communication, 1(1), 2–14.

Van Rijmenam, M., & Logue, D. (2020). Revising the ‘science of the organisation’: Theorising AI agency and actorhood. Innovation, 23(1), 127–144.

Wang, M. S., Raynard, M., & Greenwood, R. (2021). From grace to violence: Stigmatizing the medical profession in China. Academy of Management Journal, 64(6), 1842–1872.

Wang, X., Reger, R. K., & Pfarrer, M. (2021). Faster, hotter, and more linked in: Managing social disapproval in the social media era. Academy of Management Review, 46(2), 275–298.

Washington, M., & Ventresca, M. J. (2004). How organizations change: The role of institutional support mechanisms in the incorporation of higher education visibility strategies, 1874–1995. Organization Science, 15(1), 82–97.

Wright, A. L., Meyer, A. D., Reay, T., & Staggs, J. (2020). Maintaining places of social inclusion: Ebola and the emergency department. Administrative Science Quarterly, 66(1), 42–85.

Yin, R. K. (1994). Case study research: Design and methods. Thousand Oaks, CA: SAGE.

Zietsma, C., Groenewegen, P., Logue, D. M., & Hinings, C. R. (Bob) (2017). Field or fields? Building the scaffolding for cumulation of research on institutional fields. Academy of Management Annals, 11(1), 391–450.


We are grateful to the editorial team of this special volume, in particular Thomas Gegenhuber, and our two anonymous reviewers for their valuable comments. We would also like to express our appreciation to #SchauHin for offering us data access. Finally, we would like to thank the Humboldt Institute for Internet and Society for open access funding support.