
Over a faded circuit board pattern and binary code, three facial recognition face models. (Image by Lissa Deonarain)
On this week’s episode, we dive into the hidden biases of the digital age with Dr. Safiya Umoja Noble, author of the groundbreaking book, Algorithms of Oppression. Dr. Noble unpacks how search engines, often seen as neutral tools, can reinforce harmful stereotypes and limit access to critical knowledge. Join us as we explore the forces shaping our digital experiences and discuss the urgent need for accountability in technology.
Featuring:
Dr. Safiya U. Noble is the David O. Sears Presidential Endowed Chair of Social Sciences and Professor of Gender Studies, African American Studies, and Information Studies at the University of California, Los Angeles (UCLA). She is the Director of the Center on Race & Digital Justice and Co-Director of the Minderoo Initiative on Tech & Power at the UCLA Center for Critical Internet Inquiry (C2i2). She currently serves as Interim Director of the UCLA DataX Initiative, leading work in critical data studies for the campus.
Music:
– Xylo-Ziko – Phase 2
– Audiobinger – The Garden State
Making Contact Team:
– Host: Anita Johnson
– Producers: Anita Johnson, Salima Hamirani, Amy Gastelum, and Lucy Kang
– Executive Director: Jina Chung
– Engineer: [Jeff Emtman](http://www.jeffemtman.com/)
– Digital Media Marketing: Lissa Deonarain
Watch Dr. Noble discuss the themes of her book in this lecture.
TRANSCRIPT
Anita Johnson: This is Making Contact and I’m your host for the week, Anita Johnson.
In today’s digital world, search engines are often seen as neutral tools that simply help us find information. But what if the results we get from these platforms aren’t as unbiased as we think? What if they are shaped by hidden forces, reinforcing harmful stereotypes and limiting access to knowledge?
This is the premise of Dr. Safiya Umoja Noble’s groundbreaking book, Algorithms of Oppression. In her work, Dr. Noble examines how search engines, particularly Google, perpetuate biases that disproportionately affect marginalized groups — especially Black women. Her research challenges the idea that technology is a neutral force and instead reveals how it can reproduce and amplify systems of oppression.
I began my conversation with Dr. Noble by asking what inspired her to write Algorithms of Oppression, and what was the catalyst for her research into search engines and their biases.
Dr. Safiya Umoja Noble: Well, I, like many people when an economic recession happens, found myself going back to grad school. And when I went into academia at that time, around 2008, I was so surprised at how people in the university were talking about these new big tech platforms that were emerging. I mean, maybe they weren’t as big as they are now. This is like the early days of Facebook, when you needed a “.edu” email to be on the platform, for example, and YouTube had just kind of taken off and exploded, and people were imagining a future with that. People in academia were talking about these platforms as if they were things like the new public library or the future of how we would access knowledge and information…almost like it was a public good.
I really knew from having been in corporate America that we were using search engines in particular to make our clients more visible. And I really understood them as advertising technologies, not as the new public library. That dissonance between what I had experienced leaving industry and coming into academia was just the place that I wanted to investigate. And I was thinking, honestly, at first, what would it mean that these tech companies would decide, let’s say, what the history of Black knowledge and experience and culture would be? Right? That people would start to use a search engine to look for Black people and Black culture, Black history, Black knowledge, and it would be whatever the programmers decided it would be.
And that was the thing that got me. And I started doing a number of searches. I guess probably the most infamous one that I’ve done and written about was what happened when you did keyword searches on Black girls and Black women. Back then, you got almost exclusively pornography as the representation of Black girls and women. And that really was the catalyst where I said, I need to write in detail for the public and for my colleagues in academia about what a search engine is in terms of it being an advertising technology, but more importantly, what are the stakes of turning to these kinds of technologies as a guide for the future of our information ecosystem or the future of knowledge.
Anita Johnson: Backing up a bit for the listening audience, break down what algorithms are. I think most people understand search engines at this point, but what are algorithms in regard to your research, what did you discover, and how do they work, in particular?
Dr. Noble: Really, in the most technical sense, an algorithm is a set of instructions that we give to a computer to have it create a set of results for us. You could think of it like a recipe. It’s a set of instructions. That’s ultimately what algorithms do; they’re mathematical instructions, in fact, and they run every dimension of how we experience the Internet or things on our smartphones or computers on a daily basis. So there’s an algorithm, a set of instructions, that’s helping determine the shortest route in traffic if you’re using a GPS tool or map tool, right? In 2008, 2009, 2010, when I started this research, people really didn’t understand what an algorithm was. It was very much computer science jargon.
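To make the “recipe” idea concrete, here is a minimal, hypothetical sketch in Python. It is not the code of any real search engine; the pages, the signals, and the weights below are invented purely to show how a ranking algorithm is a set of human-written instructions, and how the values a programmer chooses determine what surfaces first.

```python
# A toy "search ranking" algorithm: a recipe of instructions a programmer wrote.
# Hypothetical illustration only, NOT how any real search engine works;
# the pages, scores, and weights are invented for demonstration.

# Each page is described by signals a human chose to measure.
pages = [
    {"url": "example.org/history", "relevance": 0.70, "ad_revenue": 0.10},
    {"url": "example.com/ads",     "relevance": 0.40, "ad_revenue": 0.90},
    {"url": "example.net/library", "relevance": 0.85, "ad_revenue": 0.05},
]

def rank(pages, relevance_weight, revenue_weight):
    """Order pages by a score; the weights encode a human judgment."""
    def score(page):
        return (relevance_weight * page["relevance"]
                + revenue_weight * page["ad_revenue"])
    return sorted(pages, key=score, reverse=True)

# The same data under two different "recipes": which page comes first
# depends entirely on values the programmer picked.
print([p["url"] for p in rank(pages, relevance_weight=1.0, revenue_weight=0.0)])
print([p["url"] for p in rank(pages, relevance_weight=0.3, revenue_weight=1.0)])
```

With the first set of weights, the most relevant page comes first; with the second, the page that makes the most money does. The recipe, and therefore the result, is a human decision.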
Now, of course, what we really understand is that an algorithm is driving everything. You could almost think of it as the set of instructions that’s going to overdetermine how we experience the internet. And there are lots of different kinds of algorithms, from simple to complex, that we experience every time we’re using computers. And even if we’re not using computers, now that so many industries, schools…landlords, you name it, are using software to make decisions, it means that all of us in one way or another are having decisions that affect our everyday lives determined by algorithms.
Anita Johnson: Now, knowing that, and going back to your initial search, where the first thing that would pop up for Black women and Black girls would be pornography: how particularly harmful do you think these algorithms are to certain groups or communities, based on your research? And even though you did all this research for the book years ago, has there been a change in the way algorithms play into producing a certain level of harm or misinformation targeting certain communities and groups?
Dr. Noble: I love that question, because I have to say that we know a lot about media and how it affects us, how it affects societies and communities, different age groups, different demographics. One of the things, for example, that we know about television, after more than 50 years of really good research, is that when you experience racist or sexist or discriminatory stereotypes in something like television or film, it has an impact on audiences. It has a negative impact on the people who are the subjects, right? So if it’s a racist algorithm toward African Americans, then those stereotypes circulating in our society, and people acting upon their negative beliefs about African Americans, will do things like embolden racism or discrimination in our society.
At UCLA, we have our Ralph J. Bunche Center, and one of our professors, who also happens to be our interim chancellor right now, has led the Hollywood diversity report for more than a decade, with very rich research about what happens when people are misrepresented in Hollywood. Now, I was looking at that research and thinking about the Internet, and what’s different about the Internet when you experience derogatory images about yourself or other people. In this case, I was thinking about my daughter. At the time that I was doing this research, she was a tween, and I was thinking, “Well, what happens when she is using a search engine and she’s looking for things that other Black girls are interested in, and she sees all this pornography in that context, unlike a movie or a television show?”
What we’re led to believe is that a search engine is bringing back to us the most reliable, the most credible, and the most truthful information, websites, and other kinds of resources. Now, that’s a totally different context than knowing that Hollywood is making movies based on how much money they can make. Most people who use the Internet and search engines think that smarter people in Silicon Valley have developed these technologies to help them wade through millions of websites and bring back the top 20 very best ones to that first page of search results.
And that context of believing that what you find in a search engine is fair, reliable, truthful, credible, to me is so important to study and understand, because what it means is that it’s not just my daughter who sees those derogatory images about Black girls and women, but also her peers, her teachers, her employers, you know, anybody who also gets those kinds of results. And in fact, I have been at scholarly conferences, in the early days of my research, where I would say, “Look at how these algorithms are programmed so negatively toward Black women,” and scholars themselves would stand up and say, “Maybe Black women just do more porn,” which of course we know is patently false.
The strong belief that what we get in search is very reliable is the reason why we have to interrogate what’s happening there. And of course, we see this translated to the second part of your question. In politics, when people are looking for information about political candidates or political issues, search engines are just ripe for misinformation, propaganda, and disinformation to flow through the systems, because ultimately the political action committees or candidates’ campaigns that have the most money can really affect what we see in the first page or the first couple of pages of search results. And so these are resources that I think we’ve been encouraged to rely upon when, generally speaking, we in the public know very little about how they work.
Anita Johnson: I’m so happy you said that, because listening to you, I was immediately thinking of misinformation in particular, because I think most people don’t really understand how this stuff is working. The thought initially would be, this is a computer, it’s smarter than me. But what people are failing to realize is that this computer-generated information we’re pulling up is actually being programmed by human beings.
Dr. Noble: Absolutely. That’s actually the most important dimension of this: remembering that everything we experience in the world, for the most part, outside of the natural environment, and even that is increasingly manipulated by human beings, these are products of our imagination, and they’re also limited by our imagination, or lack thereof. I mean, we know that there are a lot of things that Silicon Valley companies do not test for, that they don’t even think about looking at. We know that, for the most part, they don’t hire underrepresented ethnic minorities. They don’t hire women, generally speaking, certainly not as reflected in our population. So we have a narrow, mostly homogenous type of person working in Silicon Valley, who’s just not asking the kinds of questions, certainly, that I was asking in my research. And, you know, in the book Algorithms of Oppression, there are just dozens of examples of highly consequential problems that happen when you’re doing all kinds of different searching. And what we know is that since that book came out in 2018, the companies have worked very hard to try to fix those problems.
So there are a number of different kinds of damaging effects, and people had to have made those damaging effects. Companies, unfortunately, have also profited quite a bit from the damage, and yet, when it’s time to take responsibility or accountability for the harm that comes from those products, somehow it seems to just be articulated as, “Oh, it’s just the computer code. We’re not really sure what happened here.”
Station ID Break: You’re listening to Dr. Noble, the author of Algorithms of Oppression on Making Contact. To learn more about this week’s episode check us out at focmedia.org. Now back to the show.
Anita Johnson: Dr. Noble reminds us that while search engines provide quick answers, they can also erode our ability to engage in the deep, critical thinking necessary to navigate complex political landscapes. This shift in how we access and process information has profound implications for democracy, shaping how we understand power, history, and the decisions we make at the ballot box. To dive deeper, I asked Dr. Noble whether there were specific patterns or findings she found especially troubling about how political information is being filtered or shaped by these algorithms.
Dr. Noble: Well, one thing we know is that all of these technologies have really truncated our ability to do that deep dive, that long-form reading and preparation and research. I’ll tell you that it’s hard for me to find a colleague who’s a professor or a teacher who doesn’t comment at some point on how difficult it is to get students to read a lot of information or material. It’s increasingly difficult to assign long articles and books, things that would have been so easy 10 years ago, 15 years ago, 20 years ago. We’re also living in a time where our time and ability to access other people for discussion and conversation about issues is declining. We know, for example, there’s lower participation in organized religion. That might be a place where people would go and talk about politics or issues or things that are happening in their communities.
People are increasingly isolated, whether it’s because of things like a global pandemic, being compromised and not being able to be out in person with as many people as before. In so many different ways, we are losing our ability to do that, or we’re having more time compression. We have to work more. Things are more expensive. We have less free time to socialize and go deeper and think and just contemplate things. The whole ecosystem of how we live now has compressed our time and concentration and our ability to think deeply about the consequences of who we’re voting for and what we’re voting on.
So I think in some ways, while we can just search for something like, “Okay, what is this proposition about?” or “What is this candidate about?” or “Should I even vote or not?”, we feel like we have more access than ever. We also, in other ways, have kind of lost other types of access, other types of knowledge and thinking and time. We may have more information at our fingertips than we ever had before, but it is also increasingly difficult to make sense of what’s fact, what’s fiction, what we can count on, and how it’s going to affect our lives.
Anita Johnson: In an era where misinformation spreads with a single click and voting rights remain under attack, the stakes for civic participation have never been higher. I turned to Dr. Noble for her insights on how voter suppression tactics are evolving—and what that means for Black communities.
Dr. Noble: We want to remember that since the 2000s, and before the 2016 presidential election, Black people in particular have been specifically targeted with propaganda and disinformation to effectively discourage our political participation. We have people who are paid actors, paid influencers, to discourage our participation in electoral politics, to encourage us to withhold our vote.
We’ve also had massive, massive voter disenfranchisement, the purging of millions of African Americans off the voting rolls. That is also part of what you and I experience in our families and in our communities, where people are like, “I don’t even know…this party is taking my vote for granted and I’m going to withhold until they do right.” Those are voter suppression tactics that people may not even be aware of. Very smart people are also vulnerable to that kind of thing. So I just think we have to get more sophisticated and nuanced ourselves about what it means to take away our right to vote, even by convincing us that there’s nobody good to vote for. And there also may not be somebody good to vote for.
Anita Johnson: Algorithms that lift up misinformation can fuel apathy among Black voters — using psychological and social strategies to discourage political participation. Propaganda may amplify the belief that voting doesn’t lead to meaningful change in Black communities, by highlighting systemic failures and unfulfilled promises. This overloading of negative messaging can also lead to serious emotional exhaustion and voter fatigue, making individuals feel overwhelmed and powerless to make a difference. But checking out of the political process is a slap in the face to those who fought hard for the right to vote.
Dr. Noble: Some of us are old enough to remember the stories of our own parents not being able to vote. People who are, let’s say, Generation X and older maybe really understand the sacrifices that were made by the generation before us to have the right to vote. And it’s not like they had a lot of incredible choices, because after the Civil War, when Black men had the right to vote, there were more Black mayors in the United States than we’ve ever had in the history of this country, and you had massive Black voter participation coming kind of post-Reconstruction. And that was met, into the 1920s, with basically a reign of terror, the rise of the Ku Klux Klan, the perverse and profane lynchings that took place all over this country, and, again, tactics to suppress and roll back Black people’s voting participation and rights.
And it was our parents, and Black women, who were finally guaranteed the right to vote with the passage of the 1965 Voting Rights Act. So when young people say we have a right to not participate, that’s true. But when I think about how much blood has been spilled in this country over the profound impact that Black people could have when we did vote, then that’s another way of thinking about just what it means to withhold a vote.
It’s not just symbolic, and it’s not just about representation. It’s that when Black people vote, we actually control the electorate in a lot of places where we live and can effect incredible change and empowerment. So that’s another way to think about what it means to opt out.
Anita Johnson: Thank you so much. That was very thoughtful. I’m hoping it resonates with some folks in a way that is more progressive. But, given the power of search engines in shaping public discourse, what are some solutions or interventions that could help mitigate the negative impacts of search engines on the electoral process?
Dr. Noble: Well, one thing we need is more search engines, not a handful of monopolies that really control the information landscape. And there are people who are trying to think about what ways you could access information that is more aligned with your values, rather than with what’s most profitable for the companies to show you. I’ve always been a proponent, since I started this research, of more search engines that are specialized, that really help us parse deep and difficult information, like voting information at a local level.
I remember, even in this last election, sitting with my family and trying to deeply investigate people and propositions and ballot initiatives. And it was very difficult. In the old days, we used to have the progressive voter guides, and even those are hard to find and work your way through.
So more education, more expertise that can be made visible by different kinds of technologies, is far better than just those with the most money being able to surface. And of course I’m an educator. I am a professor, so I believe there should be deep information and education no matter what your politics are.
Anita Johnson: Last question. Based on your research and your work in this field, what can individuals and communities do to push back against algorithmic oppression?
Dr. Noble: One of the most powerful inoculations, I think, for the public is for them to understand what these systems are and how they work. Of course, my book is just one foray into understanding, and that book is very accessible. It’s not written so that you have to have a PhD to understand it. I promise! But knowledge is a very powerful inoculator. It helps you to slow down, think more critically, and have conversations with people and experts that you trust.
People who have knowledge, and I don’t just mean credentialed experts. I mean people with lived experience, people with deeper knowledge than you have about something. All of those are different forms of inoculation, rather than just trusting everything that you come into contact with online, or even the systems that you’re using. I think people can use their voice and their power to support libraries and librarians. Quite frankly, those are great places to go to continue to build our knowledge and our own stores of wisdom. I think we also want to do everything we can to protect access to education, to increase access to education. If we just leave our country and our culture to Silicon Valley to decide and to shape, we’re really gonna lose a lot. We’re already losing a lot of knowledge about how to do things.
And even just taking up…we think of this as, I don’t know, maybe parochial? But I will tell you that knowing how to do things is very valuable. You know, at the end of the day, what happens when the power grid goes down and you can’t use your phone and you can’t use the internet and you can’t use the search engine? Do you know how to do things? Are you in community with others who know how to do things? All of that, to me, is part of the inoculation that we can control, at least in our own personhood. Of course, we can also, as workers, organize on our jobs and respond to the encroachments of technology companies that are impacting how our work is changing and how the quality of our work is changing. Companies are shifting a lot of money and budget toward tech, and that budget is coming away from workers and workers’ wages.
You don’t have to be a technologist to care about what’s happening with this sector and how it’s affecting your life. And there are many places where you can have a voice and push back and do something different.
Anita Johnson: Well, Dr. Noble, this has been a great conversation, really, about digital literacy. So again, thank you so much, especially for the point you made about us being in community. When we think about the next four years, that’s going to be more important and significant than ever. You know, who do we go to? How do we access information resources? Because considering what’s already been said, and he’s been proven to do it, you know, if he says something, he most likely will do it. So, thinking about everything you said, and considering how algorithms work and search engines and technology, I couldn’t be more happy and pleased that I had this conversation with you. So thank you. Thank you again so much.
Dr. Noble: Thank you so much. And thank you for your work. I will tell you that journalism is one of the most important resources that we have, and I’m just so grateful for you and your work.
Music Transition
Anita Johnson: I’m Anita Johnson, and this is Making Contact. I spoke with Dr. Safiya Noble, Professor of Gender Studies, African American Studies, and Information Studies at the University of California, Los Angeles (UCLA). Professor Noble is the author of the best-selling book on racist and sexist algorithmic harm in commercial search engines, Algorithms of Oppression: How Search Engines Reinforce Racism.
If you want to check out the book, visit us at focmedia.org. That’s f-o-c-media dot org. Thanks for listening to Making Contact.