Proponents of generative AI tools claim they will supplement, even replace, the work of cultural production. This raises questions about the politics of visibility: what kinds of stories do these tools tend to generate, and what do they generally not? Do these tools match the kind of diversity of representation that marginalized populations and non-normative communities have fought to secure in publishing and broadcast media? I tested three widely available generative AI tools with prompts designed to reveal these normative assumptions; I prompted the tools multiple times with each, to track the diversity of the outputs to the same query. I demonstrate that, as currently designed and trained, generative AI tools tend to reproduce normative identities and narratives, rarely representing less common arrangements and perspectives. When they do generate variety, it is often narrow, maintaining deeper normative assumptions in what remains absent.
Algorithms may now be our most important knowledge technologies, “the scientific instruments of a society at large,” and they are increasingly vital to how we organize human social interaction, produce authoritative knowledge, and choreograph our participation in public life. Search engines, recommendation systems, and edge algorithms on social networking sites: these not only help us find information, they provide a means to know what there is to know and to participate in social and political discourse. If not as pervasive and structurally central as search and recommendation, trending has emerged as an increasingly common feature of such interfaces and seems to be growing in cultural importance. It represents a fundamentally different logic for how to algorithmically navigate social media: besides identifying and highlighting what might be relevant to “you” specifically, trending algorithms identify what is popular with “us” more broadly. But while the techniques may be new, the instinct is not: what today might be identified as “trending” is the latest instantiation of the instinct to map public attention and interest, be it through surveys and polling, audience metrics, market research, forecasting, or trendspotting. Understanding the calculations and motivations behind the production of these “calculated publics,” in this historical context, helps highlight how these algorithms are relevant to our collective efforts to know and be known. Rather than discuss the effect of trending algorithms, I want to ask what it means that they have become a meaningful element of public culture. Algorithms, particularly those involved in the movement of culture, are mechanisms of both distribution and valuation, part of the process by which knowledge institutions circulate and evaluate information, and by which new media industries provide and sort culture.
This essay examines the way these algorithmic techniques themselves become cultural objects, get taken up in our thinking about culture and the public to which it is addressed, and get contested both for what they do and what they reveal. We should ask not just how algorithms shape culture, but how they become culture.
Hosted by Northumbria and Birmingham City Universities, the Deplatforming Sex roundtable took place via Teams in October 2021. Participants included Danielle Blunt, Stefanie Duguay, Tarleton Gillespie and Sinnamon Love. Clarissa Smith chaired the discussion, which was transcribed and then edited to cut digressions and repetitions for publication. The roundtable provided the opportunity to reflect on recent moves to excise sex and forms of sexual commerce and performance from online spaces, while marking out some key issues for future research with and about sex workers, performers and other content providers. Our discussion provided critical engagement with ongoing legislative changes that are impacting content and providers directly and indirectly.
Public debate about content moderation has overwhelmingly focused on removal: social media platforms deleting content and suspending users, or opting not to do so. However, removal is not the only available remedy. Reducing the visibility of problematic content is becoming a commonplace element of platform governance. Platforms use machine learning classifiers to identify content they judge misleading, risky, or offensive enough that, while it does not warrant removal under the site guidelines, it does warrant demotion in algorithmic rankings and recommendations. In this essay, I document this shift and explain how reduction works. I then raise questions about what it means to use recommendation as a means of content moderation.
Recent social science concerning the information technology industries has been driven by a sense of urgency around the problems social media platforms face. But it need not be our job to solve the problems these industries have created, at least not on the terms in which they offer them. When researchers are enlisted in solving the industry’s problems, we tend to repeat some of the missteps common to the study of technology and society.
With increasing attention to the labor, criteria, and implications of content moderation come opportunities for real change in the ways that platforms are governed. After high-profile exposés like The Guardian’s “Facebook Files,” it is becoming more difficult for platforms to regulate in secret. Governments around the world are increasingly seeking to influence moderation practices, and platforms now face substantial pressure from users, civil society, and industry groups to do more on specific issues like terrorism, hatred, ‘revenge porn,’ and ‘fake news.’ In light of this pressure and the opportunities it implies, this roundtable will consider options for the future of content moderation. The question is not just how the moderation apparatus should change, but what principles should guide those changes. This panel brings together perspectives from media and information studies, law, and civil society to explore a variety of approaches to regulation, from corporate self-governance ...
Online content providers such as YouTube are carefully positioning themselves to users, clients, advertisers, and policymakers, making strategic claims as to what they do and do not do, and how their place in the information landscape should be understood. One term in particular, ‘platform,’ reveals the contours of this discursive work. ‘Platform’ has been deployed in both their populist appeals and their marketing pitches – sometimes as technical platforms, sometimes as platforms from which to speak, sometimes as platforms of opportunity. Whatever tensions exist in serving all of these constituencies are carefully elided. The term also fits their efforts to shape information policy, where they seek protection for facilitating user expression, yet also seek limited liability for what those users say. As these providers become the curators of public discourse, we must examine the roles they aim to play, and the terms with which they hope to be judged.
Algorithms (particularly those embedded in search engines, social media platforms, recommendation systems, and information databases) play an increasingly important role in selecting what information is considered most relevant to us, a crucial feature of our participation in public life. As we have embraced computational tools as our primary media of expression, we are subjecting human discourse and knowledge to the procedural logics that undergird computation. What we need is an interrogation of algorithms as a key feature of our information ecosystem, and of the cultural forms emerging in their shadows, with a close attention to where and in what ways the introduction of algorithms into human knowledge practices may have political ramifications. This essay is a conceptual map to do just that. It proposes a sociological analysis that does not conceive of algorithms as abstract, technical achievements, but suggests how to unpack the warm human and institutional choices that lie beh...
How do we know if we are where it’s at? Tarleton Gillespie explores the controversy over Twitter Trends and the algorithmic ‘censorship’ of #occupywallstreet.
In this paper, nine scholars consider how to expand research on Internet platforms’ content moderation. We consider research on encrypted platforms such as WhatsApp; on short-lived but briefly popular platforms; and on cloud computing platforms and throughout the infrastructure. We address the implications of content moderation for civic action in open societies; content moderation as regulation; its consequences for our political lives and for the quality of our cultures and daily lives; and opportunities for regulatory approaches.
Social media platforms don’t just guide, distort, and facilitate social activity, they also delete some of it. They don’t just link users together, they also suspend them. They don’t just circulate our images and posts, they also algorithmically promote some over others. Platforms pick and choose.
In this paper, we examine the inverse and converging movement of two sets of institutions: news organizations, as they find that part of their mission necessarily includes hosting an unruly user community that does not always play by the norms of journalism; and online media ...
This article considers how media workers and organizations make use of the abundance of metrics available in the contemporary online environment. The expansion of audience measurement on digital music platforms, dashboard analytics, and third-party providers raises broad societal concerns about the quantification of culture; however, less attention has been paid to how professionals in the music industries approach, understand, and deploy these metrics in their work. Drawing on survey and interview data, we found that music workers do not take metrics on faith or reject them out of hand; rather, they make sense of them, deploy them strategically, and narrate their meanings to give themselves rationales to make investments and predictions and to persuade others to do so.
This essay introduces the special issue of Social Media + Society curated by the editors of Culture Digitally and drawn from the community of Culture Digitally contributors.
We conclude with a series of questions and answers about how different stakeholders can help combat mediated misogyny and contribute to a safer world: digital platforms, journalism, the law, and universities. Experts in each of these fields present tangible advice, ethics, and guidelines for changing systems of power and challenging misogyny.
In this interdisciplinary roundtable discussion, five scholars interested in political communication work through the democratic dilemmas created when privately owned social media platforms are used as digital public squares by elected officials in the United States. This conversation unfolds in the context of ongoing legal cases that challenge politicians’ efforts to block select interlocutors and bar them from participation. We grapple with the tension between politicians’ use of social media to broadcast their own messages as a form of publicity and the desire of some members of the public that politicians be transparent online, allowing the electorate to question or even criticize them. Through this discussion, we weigh the importance of the right to criticize the government and its leaders against the realities of social media platforms that are rife with abusive content, in a cultural context marked by social inequalities.
Content moderation has exploded as a policy, advocacy, and public concern. But these debates still tend to be driven by high-profile incidents and to focus on the largest, US-based platforms. In order to contribute to informed policymaking, scholarship in this area needs to recognise that moderation is an expansive socio-technical phenomenon, which functions in many contexts and takes many forms. Expanding the discussion also changes how we assess the array of proposed policy solutions meant to improve content moderation. Here, nine content moderation scholars working in critical internet studies propose how to expand research on content moderation, with implications for policy.
Social media platforms have profoundly transformed cultural production, in part by restructuring the terms by which culture is distributed and paid for. In this article, we examine the YouTube Partner Program and the controversies around the “demonetization” of videos, to understand these arrangements and what happens when they shift beneath creators’ feet. We use the testimony of YouTubers, provided in their own videos, to understand how creators square the contradiction between YouTube’s increasingly cautious rules regarding “advertiser-friendly” content, its shifting financial and algorithmic incentive structure, and its stated values as an open platform of expression. We examine YouTube’s tiered governance strategy, in which different users are offered different sets of rules, different material resources, and different procedural protections when content is demonetized. And we examine how, especially when the details of that tiered governance are ambiguous or poorly conveyed, c...
A translation of the article originally published as “The relevance of algorithms,” by Prof. Tarleton Gillespie, in the book “Media Technologies: Essays on Communication, Materiality, and Society,” edited by Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, and published by The MIT Press in 2014.

This version was translated by me, with revision by Carlos d’Andréa, and published in Revista Parágrafo in the 2018 (Jan/April) issue, volume 6, number 1, courtesy of the US publisher.
This paper considers how media workers and organizations make use of the abundance of metrics available to producers in the contemporary online environment. While the expansion of audience measurement on digital music platforms, dashboard analytics, and third-party providers raises broad societal concerns about the quantification of culture, less attention is paid to how professionals in the music industries approach, understand, and deploy these metrics in their work. Drawing on survey and interview data, we find that music workers do not take metrics on faith or reject them out of hand; rather, they make sense of them, deploy them strategically, and narrate their meanings to give themselves rationales to make investments and predictions, and to persuade others to do so. The paper shows that the longstanding quest to “know the audience” has not been fulfilled by the rise of digitalization or the expansion of audience measurement techniques. Instead, attention to data has gotten more granular, more in need of triangulation with numerous other data sources, and at times more neurotic. The media industries continue to be a place where workers “make do” with the resources provided by the systems of which they are part, and upon which they increasingly depend, in order to manage the uncertainty that is endemic to the business of culture.