Introduction
The Power of Algorithms
This book is about the power of algorithms in the age of neoliberalism and the ways those digital decisions reinforce oppressive social relationships and enact new modes of racial profiling, which I have termed technological redlining. By making visible the ways that capital, race, and gender are factors in creating unequal conditions, I am bringing light to various forms of technological redlining that are on the rise. The near-ubiquitous use of algorithmically driven software, both visible and invisible to everyday people, demands a closer inspection of what values are prioritized in such automated decision-making systems. Typically, the practice of redlining has been most often used in real estate and banking circles, creating and deepening inequalities by race, such that, for example, people of color are more likely to pay higher interest rates or premiums just because they are Black or Latino, especially if they live in low-income neighborhoods. On the Internet and in our everyday uses of technology, discrimination is also embedded in computer code and, increasingly, in artificial intelligence technologies that we are reliant on, by choice or not. I believe that artificial intelligence will become a major human rights issue in the twenty-first century. We are only beginning to understand the long-term consequences of these decision-making tools in both masking and deepening social inequality. This book is just the start of trying to make these consequences visible. There will be many more, by myself and others, who will try to make sense of the consequences of automated decision making through algorithms in society.
Part of the challenge of understanding algorithmic
oppression is to understand that mathematical
formulations to drive automated decisions are made
by human beings. While we often think of terms such
as "big data” and “algorithms” as being benign,
neutral, or objective, they ae anything but. The people
‘who make these decisions hold all types of values,
‘many of which openly promote racism, sexism, and
{alse notions of meritocracy, which is well documented
in studies of Silicon Valley and other tech corridors.
For example, in the midst of a federal investigation
of Google's alleged persistent wage gap, where women are systematically paid less than men in the company's workforce, an "antidiversity" manifesto authored by James Damore went viral in August 2017, supported by many Google employees, arguing that women are psychologically inferior and incapable of being as good at software engineering as men, among other patently false and sexist assertions. As this book was moving into press, many Google executives and employees were actively rebuking the assertions of this engineer, who reportedly works on Google search infrastructure. Legal cases have been filed, boycotts of Google from the political far right in the United States have been invoked, and calls for greater expressed commitments to gender and racial equity at Google and in Silicon Valley writ large are under way. What this antidiversity screed has underscored for me as I write this book is that some of the very people who are developing search algorithms and architecture are willing to promote sexist and racist attitudes openly at work and beyond, while we are supposed to believe that these same employees are developing "neutral" or "objective" decision-making tools. Human beings are developing the digital platforms we use, and as I present evidence of the recklessness and lack of regard that is often shown to women and people of color in some of the output of these systems, it will become increasingly difficult for technology companies to separate their systematic and inequitable employment practices, and the far-right ideological bents of some of their employees, from the products they make for the public.
My goal in this book is to further an exploration into some of these digital sense-making processes and how they have come to be so fundamental to the classification and organization of information and at what cost. As a result, this book is largely concerned with examining the commercial co-optation of Black identities, experiences, and communities in the largest and most powerful technology companies to date, namely, Google. I closely read a few distinct cases of algorithmic oppression for the depth of their social meaning to raise a public discussion of the broader implications of how privately managed, black-boxed information-sorting tools have become essential to many data-driven decisions. I want us to have broader public conversations about the implications of the artificial intelligentsia for people who are already systematically marginalized and oppressed. I will also provide evidence and argue, ultimately, that large technology monopolies such as Google need to be broken up and regulated, because their consolidated power and cultural influence make competition largely impossible. This monopoly in the information sector is a threat to democracy, as is currently coming to the fore as we make sense of information flows through digital media such as Google and Facebook in the wake of the 2016 United States presidential election.
I situate my work against the backdrop of a twelve-year professional career in multicultural marketing and advertising, where I was invested in building corporate brands and selling products to African Americans and Latinos (before I became a university professor). Back then, I believed, like many urban marketing professionals, that companies must pay attention to the needs of people of color and demonstrate respect for consumers by offering services to communities of color, just as is done for most everyone else. After all, to be responsive and responsible to marginalized consumers was to create more market opportunity. I spent an equal amount of time doing risk management and public relations to insulate companies from any adverse risk to sales that they might experience from inadvertent or deliberate snubs to consumers of color who might perceive a brand as racist or insensitive. Protecting my former clients from enacting racial and gender insensitivity and helping them bolster their brands by creating deep emotional and psychological attachments to their products among communities of color was my professional concern for many years, which made an experience I had in fall 2010 deeply impactful. In just a few minutes while searching on the web, I experienced the perfect storm of insult and injury that I could not turn away from. While Googling things on the Internet that might be interesting to my stepdaughter and nieces, I was overtaken by the results. My search on the keywords "black girls" yielded HotBlackPussy.com as the first hit.
Hit indeed.
Since that time, I have spent innumerable hours teaching and researching all the ways in which it could be that Google could completely fail when it came to providing reliable or credible information about women and people of color yet experience seemingly no repercussions whatsoever. Two years after this incident, I collected searches again, only to find similar results, as documented in figure I.1.

Figure I.1. First search result on keywords "black girls," September 2011.
In 2012, I wrote an article for Bitch magazine about how women and feminism are marginalized in search results. By August 2012, Panda (an update to Google's search algorithm) had been released, and pornography was no longer the first series of results for "black girls"; but other girls and women of color, such as Latinas and Asians, were still pornified. By August of that year, the algorithm changed, and porn was suppressed in the case of a search on "black girls." I often wonder what kind of pressures account for the changing of search results over time. It is impossible to know when and what influences proprietary algorithmic design, other than that human beings are designing them and that they are not up for public discussion, except as we engage in critique and protest.
This book was born to highlight cases of such algorithmically driven data failures that are specific to people of color and women and to underscore the structural ways that racism and sexism are fundamental to what I have coined algorithmic oppression. I am writing in the spirit of other critical women of color, such as Latoya Peterson, cofounder of the blog Racialicious, who has opined that racism is the fundamental application program interface (API) of the Internet. Peterson has argued that anti-Blackness is the foundation on which all racism toward other groups is predicated. Racism is a standard protocol for organizing behavior on the web. As she has said, so perfectly, "The idea of a n*gger API makes me think of a racism API, which is one of our core arguments all along—oppression operates in the same formats, runs the same scripts over and over. It is tweaked to be context specific, but it's all the same source code. And the key to its undoing is recognizing how many of us are ensnared in these same basic patterns and modifying our own actions." Peterson's allegation is consistent with what many people feel about the hostility of the web toward people of color, particularly in its anti-Blackness, which any perusal of YouTube comments or other message boards will serve up. On one level, the everyday racism and commentary on the web is an abhorrent thing in itself, which has been detailed by others; but it is entirely different with the corporate platform vis-à-vis an algorithmically crafted web search that offers up racism and sexism as the first results. This process reflects a corporate logic of either willful neglect or a profit imperative that makes money from racism and sexism. This inquiry is the basis of this book.
In the following pages, I discuss how "hot," "sugary," or any other kind of "black pussy" can surface as the primary representation of Black girls and women on the first page of a Google search, and I suggest that something other than the best, most credible, or most reliable information output is driving Google. Of course, Google Search is an advertising company, not a reliable information company. At the very least, we must ask when we find these kinds of results, Is this the best information? For whom? We must ask ourselves who the intended audience is for a variety of things we find, and question the legitimacy of our being in a "filter bubble," when we do not want racism and sexism, yet they still find their way to us. The implications of algorithmic decision making of this sort extend to other types of queries in Google and other digital media platforms, and they are the beginning of a much-needed reassessment of information as a public good. We need a full-on reevaluation of the implications of our information resources being governed by corporate-controlled advertising companies. I am adding my voice to a number of scholars such as Helen Nissenbaum and Lucas Introna, Siva Vaidhyanathan, Alex Halavais, Christian Fuchs, Frank Pasquale, Kate Crawford, Tarleton Gillespie, Sarah T. Roberts, Jaron Lanier, and Elad Segev, to name a few, who are raising critiques of Google and other forms of corporate information control (including artificial intelligence) in hopes that more people will consider alternatives.
Over the years, I have concentrated my research on unveiling the many ways that African American people have been contained and constrained in classification systems, from Google's commercial search engine to library databases. The development of this concentration was born of my research training in library and information science. I think of these issues through the lenses of critical information studies and critical race and gender studies. As marketing and advertising have directly shaped the ways that marginalized people have come to be represented by digital records such as search results or social network activities, I have studied why it is that digital media platforms are resoundingly characterized as "neutral technologies" in the public domain and often, unfortunately, in academia. Stories of "glitches" found in systems do not suggest that the organizing logics of the web could be broken but, rather, that these are occasional one-off moments when something goes terribly wrong with near-perfect systems. With the exception of the many scholars whom I reference throughout this work and the journalists, bloggers, and whistleblowers whom I will be remiss in not naming, very few people are taking notice. We need all the voices to come to the fore and impact public policy on the most unregulated social experiment of our times: the Internet.
These data aberrations have come to light in various forms. In 2015, U.S. News and World Report reported that a "glitch" in Google's algorithm led to a number of problems through auto-tagging and facial-recognition software that was apparently intended to help people search through images more successfully. The first problem for Google was that its photo application had automatically tagged African Americans as "apes" and "animals." The second major issue reported by the Post was that Google Maps searches on the word "N*gger" led to a map of the White House during Obama's presidency, a story that went viral on the Internet after the social media personality Deray McKesson tweeted it.
These incidents were consistent with the reports of Photoshopped images of a monkey's face on the image of First Lady Michelle Obama that were circulating through Google Images search in 2009. In 2015, you could still find digital traces of the Google autosuggestions that associated Michelle Obama with apes. Protests from the White House led to Google forcing the image down the image stack, from the first page, so that it was not as visible. In each case, Google's position is that it is not responsible for its algorithm and that problems with the results would be quickly resolved. In the Washington Post article about "N*gger House," the response was consistent with other apologies by the company: "'Some inappropriate results are surfacing in Google Maps that should not be, and we apologize for any offense this may have caused,' a Google spokesperson told U.S. News in an email late Tuesday. 'Our teams are working to fix this issue quickly.'"
Figure I.2. Google Images results for the keyword "gorillas," April 7, 2016.
Figure I.3. Google Maps search on "N*gga House" leads to the White House, April 7, 2016.
Figure I.4. Tweet by Deray McKesson about Google Maps search and the White House, 2015.
Figure I.5. Standard Google's "related" searches associates "Michelle Obama" with the term "ape."
These human and machine errors are not without consequence, and there are several cases that demonstrate how racism and sexism are part of the architecture and language of technology, an issue that needs attention and remediation. In many ways, these cases that I present are specific to the lives and experiences of Black women and girls, people largely understudied by scholars, who remain ever precarious, despite our living in the age of Oprah and Beyoncé in Shondaland. The implications of such marginalization are profound. The insights about sexist or racist biases that I convey here are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based "tools" as if there are no political, social, or economic consequences of doing so. We need to imagine new possibilities in the area of information access and knowledge generation, particularly as headlines about "racist algorithms" continue to surface in the media with limited discussion and analysis beyond the superficial.
Inevitably, a book written about algorithms or Google in the twenty-first century is out of date immediately upon printing. Technology is changing rapidly, as are technology company configurations via mergers, acquisitions, and dissolutions. Scholars working in the fields of information, communication, and technology struggle to write about specific moments in time, in an effort to crystallize a process or a phenomenon that may shift or morph into something else soon thereafter. As a scholar of information and power, I am most interested in communicating a series of processes that have happened, which provide evidence of a constellation of concerns that the public might take up as meaningful and important, particularly as technology impacts social relations and creates unintended consequences that deserve greater attention. I have been writing this book for several years, and over time, Google's algorithms have admittedly changed, such that a search for "black girls" does not yield nearly as many pornographic results now as it did in 2011. Nonetheless, new instances of racism and sexism keep appearing in news and social media, and so I use a variety of these cases to make the point that algorithmic oppression is not just a glitch in the system but, rather, is fundamental to the operating system of the web. It has direct impact on users and on our lives beyond using Internet applications. While I have spent considerable time researching Google, this book tackles a few cases of other algorithmically driven platforms to illustrate how algorithms are serving up deleterious information about people, creating and normalizing structural and systemic isolation, or practicing digital redlining, all of which reinforce oppressive social and economic relations.
While organizing this book, I have wanted to emphasize one main point: there is a missing social and human context in some types of algorithmically driven decision making, and this matters for everyone engaging with these types of technologies in everyday life. It is of particular concern for marginalized groups, those who are problematically represented in erroneous, stereotypical, or even pornographic ways in search engines and who have also struggled for nonstereotypical or nonracist and nonsexist depictions in the media and in libraries. There is a deep body of extant research on the harmful effects of stereotyping of women and people of color in the media, and I encourage readers of this book who do not understand why the perpetuation of racist and sexist images in society is problematic to consider a deeper dive into such scholarship.
This book is organized into six chapters. In chapter 1, I explore the important theme of corporate control over public information, and I show several key Google searches. I look to see what kinds of results Google's search engine provides about various concepts, and I offer a cautionary discussion of the implications of what these results mean in historical and social contexts. I also show what Google Images offers on basic concepts such as "beauty" and various professional identities and why we should care.
In chapter 2, I discuss how Google Search reinforces stereotypes, illustrated by searches on a variety of identities that include "black girls," "Latinas," and "Asian girls." Previously, in my work published in the Black Scholar, I looked at the postmortem Google autosuggest searches following the death of Trayvon Martin, an African American teenager whose murder ignited the #BlackLivesMatter movement on Twitter and brought attention to the hundreds of African American children, women, and men killed by police or extrajudicial law enforcement. To add a fuller discussion to that research, I elucidate the processes involved in Google's PageRank search protocols, which range from leveraging digital footprints from people to the way advertising and marketing interests influence search results to how beneficial this is to the interests of Google as it profits from racism and sexism, particularly at the height of a media spectacle.
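Because PageRank anchors that discussion, a minimal sketch of the published algorithm (Brin and Page's 1998 power-iteration method) may help readers see what kind of procedure is at issue. This is the textbook formulation only; Google's production ranking layers many additional, proprietary signals, including the advertising and marketing interests discussed above, on top of anything like it.

```python
# A minimal sketch of textbook PageRank (Brin and Page, 1998), not Google's
# production system. A page's score is the chance that a "random surfer"
# lands on it while following links, with occasional random jumps.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # a dangling page spreads its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page web: the heavily linked-to page accrues the most
# rank regardless of whether its content is credible or harmful.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(web))
```

The point relevant to this book is structural: nothing in the recurrence evaluates credibility or harm. Link popularity is the signal, so whatever commercial or cultural forces shape who links to whom also shape the ranking.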
In chapter 3, I examine the importance of noncommercial search engines and information portals, specifically looking at the case of how a mass shooter and avowed White supremacist, Dylann Roof, allegedly used Google Search in the development of his racial attitudes, attitudes that led to his murder of nine African American AME Church members while they worshiped in their South Carolina church in the summer of 2015. The provision of false information that purports to be credible news, and the devastating consequences that can come from this kind of algorithmically driven information, is an example of why we cannot afford to outsource and privatize uncurated information on the increasingly neoliberal, privatized web. I show how important records are to the public and explore the social importance of both remembering and forgetting, as digital media platforms thrive on never or rarely forgetting. I discuss how information online functions as a type of record, and I argue that much of this information and its harmful effects should be regulated or subject to legal protections. Furthermore, at a time when "right to be forgotten" legislation is gaining steam in the European Union, efforts to regulate the ways that technology companies hold a monopoly on public information about individuals and groups need further attention in the United States. Chapter 3 is about the future of information culture, and it underscores the ways that information is not neutral and how we can reimagine information culture in the service of eradicating social inequality.
Chapter 4 is dedicated to critiquing the field of information studies and foregrounds how these issues of public information through classification projects on the web, such as commercial search, are old problems that we must solve as a scholarly field of researchers and practitioners. I offer a brief survey of how library classification projects undergird the invention of search engines such as Google and how our field is implicated in the algorithmic process of sorting and classifying information and records. In chapter 5, I discuss the future of knowledge in the public and reference the work of library and information professionals, in particular, as important to the development and cultivation of equitable classification systems, since these are the precursors to commercial search engines. This chapter is essential history for library and information professionals, who are less likely to be trained on the politics of cataloguing and classification bias in their professional training. Chapter 6 explores public policy and why we need regulation in our information environments, particularly as they are increasingly controlled by corporations.
To conclude, I move the discussion beyond Google, to help readers think about the impact of algorithms on how people are represented in other seemingly benign business transactions. I look at the "colorblind" organizing logic of Yelp and how business owners are revolting due to loss of control over how they are represented and the impact of how the public finds them. Here, I share an interview with Kandis from New York, whose livelihood has been dramatically affected by public-policy changes such as the dismantling of affirmative action on college campuses, which have hurt her local Black-hair-care business in a prestigious college town. Her story brings to light the power that algorithms have on her everyday life and leaves us with more to think about in the ecosystem of algorithmic power. The book closes with a call to recognize the importance of how algorithms are shifting social relations in many ways—more ways than this book can cover—and should be regulated with more impactful public policy in the United States than we currently have. My hope is that this book will directly impact the many kinds of algorithmic decisions that can have devastating consequences for people who are already marginalized by institutional racism and sexism, including the 99% who own so little wealth in the United States that the alarming trend of social inequality is not likely to reverse without our active resistance and intervention. Electoral politics and financial markets are just two of many of these institutional wealth-consolidation projects that are heavily influenced by algorithms and artificial intelligence. We need to cause a shift in what we take for granted in our everyday use of digital media platforms.
I consider my work a practical project, the goal of which is to eliminate social injustice and change the ways in which people are oppressed with the aid of allegedly neutral technologies. My intention in looking at these cases serves two purposes. First, we need interdisciplinary research and scholarship in information studies and library and information science that intersects with gender and women's studies, Black/African American studies, media studies, and communications to better describe and understand how algorithmically driven platforms are situated in intersectional sociohistorical contexts and embedded within social relations. My hope is that this work will add to the voices of my many colleagues across several fields who are raising questions about the legitimacy and social consequences of algorithms and artificial intelligence. Second, now, more than ever, we need experts in the social sciences and digital humanities to engage in dialogue with activists and organizers, engineers, designers, information technologists, and public-policy makers before blunt artificial-intelligence decision making trumps nuanced human decision making. This means that we must look at how the outsourcing of information practices from the public sector facilitates privatization of what we previously thought of as the public domain and how corporate-controlled governments and companies subvert our ability to intervene in these practices.

We have to ask what is lost, who is harmed, and what should be forgotten with the embrace of artificial intelligence in decision making. It is of no collective social benefit to organize information resources on the web through processes that solidify inequality and marginalization—on that point I am hopeful many people will agree.

A Society, Searching
On October 21, 2013, the United Nations launched a campaign directed by the advertising agency Memac Ogilvy & Mather Dubai using "genuine Google searches" to bring attention to the sexist and discriminatory ways in which women are regarded and denied human rights. Christopher Hunt, art director of the campaign, said, "When we came across these searches, we were shocked by how negative they were and decided we had to do something with them." Kareem Shuhaibar, a copywriter for the campaign, described on the United Nations website what the campaign was determined to show: "The ads are shocking because they show just how far we still have to go to achieve gender equality. They are a wake up call, and we hope that the message will travel far." Over the mouths of various women of color were the autosuggestions that reflected the most popular searches that take place on Google Search. The Google Search autosuggestions featured a range of sexist ideas such as the following:

• Women cannot: drive, be bishops, be trusted, speak in church
• Women should not: have rights, vote, work, box
• Women should: stay at home, be slaves, be in the kitchen, not speak in church
• Women need to: be put in their places, know their place, be controlled, be disciplined
While the campaign employed Google Search results to make a larger point about the status of public opinion toward women, it also served, perhaps unwittingly, to underscore the incredibly powerful nature of search engine results. The campaign suggests that search is a mirror of users' beliefs and that society still holds a variety of sexist ideas about women. What I find troubling is that the campaign also reinforces the idea that it is not the search engine that is the problem but, rather, the users of search engines who are. It suggests that what is most popular is simply what rises to the top of the search pile. While serving as an important and disturbing critique of sexist attitudes, the campaign fails to implicate the algorithms or search engines that drive certain results to the top. This chapter moves the lens onto the search architecture itself in order to shed light on the many factors that keep sexist and racist ideas on the first page.
Figure 1.1. Memac Ogilvy & Mather Dubai advertising campaign for the United Nations.
One limitation of looking at the implications of search is that it is constantly evolving and shifting over time. This chapter captures aspects of commercial search at a particular moment—from 2009 to 2015—but surely by the time readers engage with it, it will be a historical rather than contemporary study. Nevertheless, the goal of such an exploration of why we get troublesome search results is to help us think about whether it truly makes sense to outsource all of our knowledge needs to commercial search engines, particularly at a time when the public is increasingly reliant on search engines in lieu of libraries, librarians, teachers, researchers, and other knowledge keepers and resources.
What is even more crucial is an exploration of how people living as minority groups under the influence of a majority culture, such as people of color and sexual minorities in the United States, are often subject to the whims of the majority and other commercial influences such as advertising when trying to affect the kinds of results that search engines offer about them and their identities. If the majority rules in search engine results, then how might those who are in the minority ever be able to influence or control the way they are represented in a search engine? The same might be true of how men's desires and usage of search is able to influence the values that surround women's identities in search engines, as the Ogilvy campaign might suggest. For these reasons, a deeper exploration into the historical and social conditions that give rise to problematic search results is in order, since rarely are they questioned and most Internet users have no idea how these ideas come to dominate search results on the first page of results in the first place.

Google Search: Racism and Sexism at the Forefront
My first encounter with racism in search came to me through an experience that pushed me, as a researcher, to explore the mechanisms—both technological and social—that could render the pornification of Black women a top search result, naturalizing Black women as sexual objects so effortlessly. This encounter was in 2009 when I was talking to a friend, André Brock at the University of Michigan, who casually mentioned one day, "You should see what happens when you Google 'black girls.'" I did and was stunned. I assumed it to be an aberration that could potentially shift over time. I kept thinking about it. The second time came one spring morning in 2011, when I searched for activities to entertain my preteen stepdaughter and her cousins of similar age, all of whom had made a weekend visit to my home, ready for a day of hanging out that would inevitably include time on our laptops. In order to break them away from mindless TV watching and cellphone gazing, I wanted to engage them in conversations about what was important to them and on their mind, from their perspective as young women growing up in downstate Illinois, a predominantly conservative part of Middle America. I felt that there had to be some great resources for young people of color their age, if only I could locate them. I quickly turned to the computer I used for my research (I was pursuing doctoral studies at the time), but I did not let the group of girls gather around me just yet. I opened up Google to enter in search terms that would reflect their interests, demographics, and information needs, but I liked to prescreen and anticipate what could be found on the web, in order to prepare for what might be in store. What came back from that simple, seemingly innocuous search was again nothing short of shocking: with the girls just a few feet away giggling and snorting at their own jokes, I again retrieved a Google Search results page filled with porn when I looked for "black girls." By then, I thought that my own search history and engagement with a lot of Black feminist texts, videos, and books on my laptop would have shifted the kinds of results I would get. It had not. In intending to help the girls search for information about themselves, I had almost inadvertently exposed them to one of the most graphic and overt illustrations of what the advertisers already thought about them: Black girls were still the fodder of porn sites, dehumanizing them as commodities, as products and as objects of sexual gratification. I closed the laptop and redirected our attention to fun things we might do, such as see a movie down the street. This best information, as listed by rank in the search results, was certainly not the best information for me or for the children I love. For whom, then, was this the best information, and who decides? What were the profit and other motives driving this information to the top of the results? How had the notion of neutrality in information ranking and retrieval gone so sideways as to be perhaps one of the worst examples of racist and sexist classification of Black women in the digital age yet remain so unexamined and without public critique? In that moment, I began in earnest a series of research inquiries that are central to this book.
Of course, upon reflection, I realized that I had been using the web and search tools long before the encounters I experienced just out of view of my young family members. It was just as troubling to realize that I had undoubtedly been confronted with the same type of results before but had learned, or been trained, to somehow become inured to it, to take it as a given that any search I might perform using keywords connected to my physical self and identity could return pornographic and otherwise disturbing results. Why was this the bargain into which I had tacitly entered with digital information tools? And who among us did not have to bargain in this way? As a Black woman growing up in the late twentieth century, I also knew that the presentation of Black women and girls that I discovered in my search results was not a new development of the digital age. I could see the connection between search results and tropes of African Americans that are as old and endemic to the United States as the history of the country itself. My background as a student and scholar of Black studies and Black history, combined with my doctoral studies in the political economy of digital information, aligned with my righteous indignation for Black girls everywhere. I searched on.

Figure 1.2. First page of search results on keywords "black girls," September 18, 2011.
Figure 1.3. First page of image search results on keywords "black girls," April 3, 2014.
Figure 1.4. Google autosuggest results when searching the phrase "why are black people so," January 25, 2013.

Figure 1.5. Google autosuggest results when searching the phrase "why are black women so," January 25, 2013.

Figure 1.6. Google autosuggest results when searching the phrase "why are white women so," January 25, 2013.
Figure 1.7. Google Images results when searching the concept "beautiful" (did not include the word "women"), December 4, 2014.
Figure 1.8. Google Images results when searching the concept "ugly" (did not include the word "women"), January 5, 2013.
Figure 1.9. Google Images results when searching the
phrase “professor style” while logged in as myself,
September 15, 2015.
What each of these searches represents are Google's algorithmic conceptualizations of a variety of people and ideas. Whether looking for autosuggestions or answers to various questions or looking for notions about what is beautiful or what a professor may look like (which does not account for people who look like me who are part of the professoriate—so much for "personalization"), Google's dominant narratives reflect the kinds of hegemonic frameworks and notions that are often resisted by women and people of color. Interrogating what advertising companies serve up as credible information must happen, rather than have a public instantly gratified with stereotypes in three-hundredths of a second or less.
In reality, information monopolies such as Google have the ability to prioritize web search results on the basis of a variety of topics, such as promoting their own business interests over those of competitors or smaller companies that are less profitable advertising clients than larger multinational corporations are. In this case, the clicks of users, coupled with the commercial processes that allow paid advertising to be prioritized in search results, mean that representations of women are ranked on a search engine page in ways that underscore women's historical and contemporary lack of status in society—a direct mapping of old media traditions into new media architecture.
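To make that mechanism concrete, consider a deliberately simplified, hypothetical ranking function. The fields, weights, and signals below are illustrative assumptions, not Google's actual formula, which is proprietary; the structural point is that once raw click-through and advertiser payments enter the score, heavily clicked or paid-for content can outrank more credible material.

```python
# A hypothetical, deliberately simplified ranking score. The signals and
# weights are invented for illustration; real ranking formulas are proprietary.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # topical match to the query, 0..1
    click_rate: float  # historical click-through rate, 0..1
    ad_spend: float    # advertiser payments, arbitrary units

def score(r: Result) -> float:
    # Blending clicks and ad dollars with relevance lets popular or
    # paid-for content outrank more credible material.
    return 0.3 * r.relevance + 0.4 * r.click_rate + 0.3 * min(r.ad_spend / 100, 1.0)

results = [
    Result("credible-archive.example", relevance=0.9, click_rate=0.1, ad_spend=0),
    Result("porn-site.example", relevance=0.4, click_rate=0.8, ad_spend=90),
]
for r in sorted(results, key=score, reverse=True):
    print(f"{score(r):.2f}  {r.url}")  # the commercially boosted page ranks first
```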
Problematic representations and biases in classifications are not new. Critical library and information science scholars have well documented the ways in which some groups are more vulnerable than others to misrepresentation and misclassification. They have conducted extensive and important critiques of library cataloging systems and information organization patterns that demonstrate how women, Black people, Asian Americans, Jewish people, or the Roma, as "the other," have all suffered from the insults of misrepresentation and derision in the Library of Congress Subject Headings (LCSH) or through the Dewey Decimal System. At the same time, other scholars underscore the myriad ways that social values around race and gender are directly reflected in technology design. Their contributions have made it possible for me to think about the ways that race and gender are embedded in Google's search engine and to have the courage to raise critiques of one of the most beloved and revered contemporary brands.
Search happens in a highly commercial environment, and a variety of processes shape what can be found; these results are then normalized as believable and often presented as factual. The associate professor of sociology at Arizona State University and former president of the Association of Internet Researchers Alex Halavais points to the way that heavily used technological artifacts such as the search engine have become such a normative part of our experience with digital technology and computers that they socialize us into believing that these artifacts must therefore also provide access to credible, accurate information that is depoliticized and neutral:
Those assumptions are dangerously flawed: unpacking the black box of the search engine is something of interest not only to technologists and marketers, but to anyone who wants to understand how we make sense of a newly networked world. Search engines have come to play a central role in corralling and controlling the ever-growing sea of information that is available to us, and yet they are trusted more readily than they ought to be. They freely provide, it seems, a sorting of the wheat from the chaff, and answer our most profound and most trivial questions. They have become an object of faith.
Unlike the human-labor curation processes of the early Internet that led to the creation of online directories such as Lycos and Yahoo!, in the current Internet environment, information access has been left to the complex algorithms of machines to make selections and prioritize results for users. I agree with Halavais, and his is an important critique of search engines as a window into our own desires, which can have an impact on the values of society. Search is a symbiotic process that both informs and is informed in part by users. Halavais suggests that every user of a search engine should know how the system works, how information is collected, aggregated, and accessed. To achieve this vision, the public would have to have a high degree of computer programming literacy to engage deeply in the design and output of search.
Alternatively, I draw an analogy: one need not know the mechanism of radio transmission or television spectrum or how to build a cathode ray tube in order to critique racist or sexist depictions in song lyrics played on the radio or shown in a film or television show. Without a doubt, the public is unaware and must have significantly more algorithmic literacy. Since all of the platforms I interrogate in this book are proprietary, even if we had algorithmic literacy, we still could not intervene in these private, corporate platforms.
To be specific, knowledge of the technical aspects of search and retrieval, in terms of critiquing the computer programming code that underlies the systems, is absolutely necessary to have a profound impact on these systems. Interventions such as Black Girls Code, an organization focused on teaching young African American girls to program, are the kind of intervention we see building in response to the ways Black women have been locked out of Silicon Valley venture capital and broader participation. Simultaneously, it is important for the public, particularly people who are marginalized—such as women and girls and people of color—to be critical of the results that purport to represent them in the first ten to twenty results in a commercial search engine. They do not have the economic, political, and social capital to withstand the consequences of misrepresentation. If one holds a lot of power, one can withstand or buffer misrepresentation at a group level and often at the individual level. Marginalized and oppressed people are linked to the status of their group and are less likely to be afforded individual status and insulation from the experiences of the groups with which they are identified. The political nature of search demonstrates how algorithms are a fundamental invention of computer scientists who are human beings—and code is language full of meaning and applied in varying ways to different types of information. Certainly, women and people of color could benefit tremendously from becoming programmers and building alternative search engines that are less disturbing and that reflect and prioritize a wider range of informational needs and perspectives.
There is an important and growing movement of scholars raising concerns. Helen Nissenbaum, a professor of media, culture, and communication and computer science at New York University, has written with Lucas Introna, a professor of organization, technology, and ethics at the Lancaster University Management School, about how search engines bias information toward the most powerful online. Their work was corroborated by Alejandro Diaz, who wrote his dissertation at Stanford on sociopolitical bias in Google's products. Kate Crawford and Tarleton Gillespie, two researchers at Microsoft Research New England, have written extensively about algorithmic bias, and Crawford recently coorganized a summit with the White House and New York University for academics, industry, and activists concerned with the social impact of artificial intelligence in society. At that meeting, I participated in a working group on artificial-intelligence social inequality, where tremendous concern was raised about deep-machine-learning projects and software applications, including concern about furthering social injustice and structural racism. In attendance was the journalist Julia Angwin, one of the investigators of the breaking story about courtroom sentencing software by the company Northpointe, used for risk assessment by judges to determine the alleged future criminality of defendants. She and her colleagues determined that this type of artificial intelligence miserably mispredicted future criminal activity and led to the overincarceration of Black defendants. Conversely, the reporters found it was much more likely to predict that White criminals would not offend again, despite the data showing that this was not at all accurate. Sitting next to me was Cathy O'Neil, a data scientist and the author of the book Weapons of Math Destruction, who has an insider's view of the way that math and big data are directly implicated in the financial and housing crisis of 2008 (which, incidentally, destroyed more African American wealth than any other event in the United States, save for not compensating African Americans for three hundred years of forced enslavement). Her view from Wall Street was telling:

The math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of these models encoded human prejudice, misunderstanding, and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, were beyond dispute or appeal. And they tended to punish the poor and the oppressed in our society, while making the rich richer.
Our work, each of us, in our respective way, is about interrogating the many ways that data and computing have become so profoundly their own "truth" that even in the face of evidence, the public still struggles to hold tech companies accountable for the products and errors of their ways. These errors increasingly lead to racial and gender profiling, misrepresentation, and even economic redlining.
At the core of my argument is the way in which Google biases search to its own economic interests—for its profitability and to bolster its market dominance at any expense. Many scholars are working to illuminate the ways in which users trade their privacy, personal information, and immaterial labor for "free" tools and services offered by Google (e.g., search engine, Gmail, Google Scholar, YouTube) while the company profits from data mining its users. Recent research on Google by Siva Vaidhyanathan, professor of media studies at the University of Virginia, who has written one of the most important books on Google to date, demonstrates its dominance over the information landscape and forms the basis of a central theme in this research. Frank Pasquale, a professor of law at the University of Maryland, has also forewarned of the increasing levels of control that algorithms have over the many decisions made about us, from credit to dating options, and how difficult it is to intervene in their discriminatory effects. The political economic critique of Google by Elad Segev, a senior lecturer of media and communication in the Department of Communication at Tel Aviv University, charges that we can no longer ignore the global dominance of Google and the implications of its power in furthering digital inequality, particularly as it serves as a site of fostering global economic divides.
However, what is missing from the extant work on Google is an intersectional power analysis that accounts for the ways in which marginalized people are exponentially harmed by Google. Since I began writing this book, Google's parent company, Alphabet, has expanded its power into drone technology, military-grade robotics, fiber networks, and behavioral surveillance technologies such as Nest and Google Glass. These are just several of many entry points to thinking about the implications of artificial intelligence as a human rights issue. We need to be concerned about not only how ideas and people are represented but also the ethics of whether robots and other forms of automated decision making can end a life, as in the case of drones and automated weapons. To whom do we appeal? What bodies govern artificial intelligence, and where does the public raise issues or lodge complaints with national and international courts? These questions have yet to be fully answered.
In the midst of Google's expansion, Google Search is one of the most underexamined areas of consumer protection policy, and regulation has been far less successful in the United States than in the European Union. A key aspect of generating policy that protects the public is the accumulation of research about the impact of what an unregulated commercial information space does to vulnerable populations. I do this by taking a deep look at a snapshot of the web, at a specific moment in time, and interpreting the results against the history of race and gender in the U.S. This is only one of many angles that could be taken up, but I find it to be one of the most compelling ways to show how data is biased and perpetuates racism and sexism.
The problems of big data go deeper than misrepresentation, for sure. They include decision-making protocols that favor corporate elites and the powerful, and they are implicated in global economic and social inequality. Deep machine learning, which is using algorithms to replicate human thinking, is predicated on specific values from specific kinds of people—namely, the most powerful institutions in society and those who control them. Diana Ascher, in her dissertation on yellow journalism and cultural time orientation in the Department of Information Studies at UCLA, found there was a stark difference between headlines generated by social media managers from the LA Times and those provided by automated, algorithmically driven software, which generated severe backlash on Twitter. In this case, Ascher found that automated tweets in news media were more likely to be racist and misrepresentative, as in the case of police shooting victim Keith Lamont Scott of Charlotte, North Carolina, whose murder triggered nationwide protests of police brutality and excessive force.
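The claim that machine learning is predicated on specific values from specific kinds of people can be made concrete with a toy sketch. The data below are invented, and the "model" is the simplest possible one, a majority-label lookup; it exists only to show the mechanism by which a system trained on labels encoding a labeler's prejudice reproduces that prejudice automatically and at scale.

```python
# A toy illustration with invented data: a "model" trained on biased labels
# reproduces the bias. The learned rule is nothing but the labelers' values.

from collections import Counter, defaultdict

# Hypothetical training set of (group, label) pairs. The skewed labels
# reflect the labelers' prejudice, not any truth about the people described.
training = (
    [("A", "risky")] * 80 + [("A", "safe")] * 20
    + [("B", "safe")] * 80 + [("B", "risky")] * 20
)

votes = defaultdict(Counter)
for group, label in training:
    votes[group][label] += 1

def predict(group: str) -> str:
    # Predict the majority label seen for this group during training.
    return votes[group].most_common(1)[0][0]

print(predict("A"))  # -> "risky": the prejudice in the data, automated
print(predict("B"))  # -> "safe"
```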
There are many such examples. In the ensuing chapters, I continue to probe the results that are generated by Google on a variety of keyword combinations relating to racial and gender identity as a way of engaging a commonsense understanding of how power works, with the goal of changing these processes of control. By seeing and discussing these intersectional power relations, we have a significant opportunity to transform the consciousness embedded in artificial intelligence, since it is in fact, in part, a product of our own collective creation.
[Tweet from the Los Angeles Times: "Keith Lamont Scott had a complicated past: 5 arrests, prison, 20 years of marriage, a good review at work"]
Figure 1.10. Automated headline generated by software and tweeted about Keith Lamont Scott, killed by police in North Carolina on September 20, 2016, as reported by the Los Angeles Times.
Theorizing Search: A Black Feminist Project
The impetus for my work comes from theorizing Internet search results from a Black feminist perspective; that is, I ask questions about the structure and results of web searches from the standpoint of a Black woman—a standpoint that drives me to ask different questions than have been previously posed about how Google Search works. This study builds on previous research that looks at the ways in which racialization is a salient factor in various engagements with digital technology represented in video games, websites, virtual worlds, and digital media platforms. A Black feminist perspective offers an opportunity to ask questions about the quality and content of racial hierarchies and stereotyping that appear in results from commercial search engines such as Google's; it contextualizes them by decentering the dominant lenses through which results about Black women and girls are interpreted. By doing this, I am purposefully theorizing from a feminist perspective, while addressing often-overlooked aspects of race in feminist theories of technology. The professor emeritus of science and technology at UCLA Sandra Harding suggests that there is value in identifying a feminist method and epistemology:
Feminist challenges reveal that the questions that are asked—and, even more significantly, those that are not asked—are at least as determinative of the adequacy of our total picture as are any answers that we can discover. Defining what is in need of scientific explanation only from the perspective of bourgeois, white men's experiences leads to partial and even perverse understandings of social life. One distinctive feature of feminist research is that it generates problematics from the perspective of women's experiences.
Rather than assert that problematic or racist results are impossible to correct, in the ways that the Google disclaimer suggests, I believe a feminist lens, coupled with racial awareness about the intersectional aspects of identity, offers new ground and interpretations for understanding the implications of such problematic positions about the benign instrumentality of technologies. Black feminist ways of knowing, for example, can look at searches on terms such as "black girls" and bring into the foreground evidence about the historical tendencies to misrepresent Black women in the media. Of course, these misrepresentations and the use of big data to maintain and exacerbate social relationships serve a powerful role in maintaining racial and gender subjugation. It is the persistent normalization of Black people as aberrant and undeserving of human rights and dignity under the banners of public safety, technological innovation, and the emerging creative economy that I am directly challenging by showing the egregious ways that