Page 3 of 5 — The harm

The algorithm doesn't see you — it sorts you

Safiya Umoja Noble's research shows that when profit is the algorithm's only value, certain bodies and identities become the most monetizable — and the most misrepresented.

Data as product means people as stereotype

We already established that users are the product — that behavioral data is collected, profiled, and sold to advertisers. Noble's contribution is to ask: what happens when that logic meets race and identity? Her answer: the algorithm doesn't just profile people. It misrepresents them — and it does so profitably.

What page 2 showed us

Platforms optimize for engagement. Content that generates the strongest behavioral signals — clicks, watches, shares — gets surfaced. The algorithm serves what performs, not what is true or fair.

What Noble adds

When "what performs" is shaped by a society with existing racial hierarchies, the algorithm doesn't neutralize those hierarchies — it amplifies them. The most profitable representations of marginalized groups are often the most harmful ones.

The mechanism

Advertisers pay for audiences. The more clicks a search result or piece of content generates, the more valuable it is. Sexualized or stereotyped content about Black women, for example, generated more ad revenue — so it got surfaced more. The market rewarded the misrepresentation.
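The mechanism above can be sketched in a few lines of Python. This is an illustrative toy, not any real search engine's ranking function: the result names, `ctr` and `cpm` figures are invented, and the scoring rule (click-through rate times advertiser CPM) is an assumption made for demonstration.

```python
# Illustrative toy only: a ranker that scores results purely by expected
# ad revenue. Real search engines do not publish their ranking functions;
# this sketch just shows what a revenue-only objective does to ordering.

def expected_revenue(result):
    # eCPM-style score: probability of a click times what a click pays.
    return result["ctr"] * result["cpm"]

def rank(results):
    # Sort descending by expected revenue. Note what is absent from the
    # objective: no term for accuracy, fairness, or harm. Only what pays.
    return sorted(results, key=expected_revenue, reverse=True)

# Hypothetical figures for three kinds of content.
results = [
    {"name": "sexualized content",  "ctr": 0.12, "cpm": 4.50},
    {"name": "stereotyped content", "ctr": 0.09, "cpm": 3.00},
    {"name": "educational content", "ctr": 0.03, "cpm": 0.40},
]

for r in rank(results):
    print(r["name"], expected_revenue(r))
```

With these made-up numbers, the educational result lands last every time, which is the point: the ordering follows the money, not the merit.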

The implication

The algorithm isn't neutral. It is a market. And markets reflect the values — and the prejudices — of the people who built them and the advertisers who fund them. "Objective" results are never objective; they are the output of economic incentives operating on social data.

Noble's research documented search engine results that consistently returned sexualized, criminal, or otherwise degrading images and content when users searched for Black women and girls — results that were neither accidental nor inevitable, but the direct product of an ad-revenue model that treated those representations as high-performing inventory. The platform didn't intend racism. It just priced it.

The algorithm as gatekeeper

Academic context — required reading

The examples below are drawn directly from Safiya Umoja Noble's Algorithms of Oppression, Chapter 2 (2018). Noble documented these search results as part of her scholarly research into how commercial search engines misrepresent Black women and girls. They are presented here in an academic context to illustrate her argument — not to reproduce harm, but to name it clearly, as Noble herself does. Her research changed how scholars, policymakers, and technologists think about algorithmic bias.

Noble draws a direct parallel to historical redlining — the practice of banks and real estate agencies systematically denying loans and housing to people based on race. Digital redlining works differently but produces the same structural effect: algorithms sort identities into categories defined by commercial value, and those categories consistently disadvantage communities of color. The result is that Black women and girls, in Noble's documented research, were more likely to be represented through sexualization and stereotype than through achievement, history, or humanity — because sexualization and stereotype were more profitable.

The demo below walks through Noble's three core search examples from Chapter 2. In each case, ask: who benefits from what gets surfaced? And whose identity is being defined by someone else's ad budget?


Search query: "Black girls" — documented by Noble, 2010–2011

1. Adult entertainment sites — sexualized content (adult-content.example.com). Highest ad revenue in category · Surfaced by click-through rate.

2. Adult entertainment sites — sexualized content (adult-content.example.com). Multiple high-revenue advertisers in this category.

3. Stereotyped content — crime and poverty framing (news-aggregator.example.com). High emotional engagement · Outrage-driven click-through.

9+. Educational resources — Black girls' achievement, mentorship, scholarship (nonprofit.org / education.example.com). No advertiser · Low commercial value · Effectively invisible.

This is Noble's central finding. When she searched "Black girls" in 2010, the first page was dominated by sexualized content — not because of a malicious programmer, but because that content generated the most ad revenue. The algorithm had no mechanism for asking whether the result was harmful. It only asked whether it was profitable. Educational content about Black girls' achievement appeared on page 9 or later — not because it was less true, but because it was worth less to advertisers. Noble argues this is not a bug. It is the system working exactly as designed, and the design is the problem.

Search query: "Black women" — documented by Noble, 2010–2011

1. Adult entertainment sites — sexualized content (adult-content.example.com). Highest advertiser spend in category · Keyword match.

2. Stereotyped imagery — controlling narratives about Black womanhood (content-farm.example.com). High engagement · Reinforces existing stereotypes · Ad-supported.

8+. Black women in history, leadership, literature, public life (various educational sources). Fragmented advertiser pool · Low CPM · Buried.

Noble's research showed that "Black women" returned results overwhelmingly shaped by sexualization and stereotype — while searches for "white women" returned fashion, lifestyle, and professional content. The disparity is not random. It reflects which representations of womanhood have been funded by advertisers, and which have not. The algorithm doesn't create this hierarchy — but it amplifies and normalizes it at enormous scale, for every user who searches.

Search query: "professional hair" — documented by Noble, 2010–2011

1. Straight, blonde styles — "office appropriate" framing (beauty.example.com). Dominant beauty advertiser spend · Eurocentric standard encoded.

2. Blow-out tutorials — fine, straight hair only (youtube.example.com). High watch time · Algorithm rewards retention · Excludes natural hair.

7+. Natural hair styles for Black women in professional settings (naturalhair.example.com). Smaller advertiser base · Undermonetized · Lower in results.

The algorithm doesn't decide what "professional" means — but it inherits and reinforces whoever decided first. When beauty advertisers overwhelmingly fund content representing one standard, that standard dominates results. Natural Black hair styles are not less professional — they are less monetized. Noble connects this directly to workplace discrimination: the same logic that buries these results from search also shapes what employers see when they imagine a "professional" appearance. The CROWN Act was passed in several U.S. states specifically because this bias has real legal consequences for Black workers.

Redlining doesn't just surface stereotypes — it buries identities

Digital redlining operates in two directions. The algorithm doesn't only surface harmful content — it also suppresses content that advertisers deem "high-risk" or "brand-unsafe." Shadowbanning is the practice of silently limiting the reach of certain content without notifying the creator. The post stays up. It just stops traveling.

Documented patterns show that content from LGBTQ+ creators, Black creators discussing race, and communities posting in languages other than English has disproportionately experienced reduced reach — not because of explicit policy violations, but because it triggers advertiser risk filters or falls into lower-CPM audience categories. The algorithm doesn't ban these creators. It makes them quieter.
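The "quieter, not banned" logic can be sketched as a reach multiplier. Everything in this sketch is hypothetical: the keyword list, the thresholds, and the multipliers are invented to illustrate how an assumed brand-safety filter could throttle policy-compliant content without ever removing it.

```python
# Illustrative toy only: a hypothetical distribution filter. Real platforms
# do not disclose their brand-safety classifiers; the terms and numbers
# below are invented for demonstration.

SENSITIVE_TERMS = {"lgbtq", "racism", "sex education", "disability"}

def reach_multiplier(post):
    # Policy-compliant posts are never removed. Flagged ones are simply
    # shown to a fraction of the audience they would otherwise reach.
    text = post["text"].lower()
    if any(term in text for term in SENSITIVE_TERMS):
        return 0.2   # flagged "brand-unsafe": quietly throttled
    if post["audience_cpm"] < 1.0:
        return 0.5   # low-value ad market: deprioritized
    return 1.0       # brand-safe, high-CPM audience: full distribution

post_a = {"text": "Five brand-safe lifestyle tips", "audience_cpm": 6.0}
post_b = {"text": "A thread on racism in hiring",   "audience_cpm": 6.0}

print(reach_multiplier(post_a))
print(reach_multiplier(post_b))
```

Both posts stay up and neither violates any policy, yet in this toy model the second travels to a fifth of the audience. That is the structural shape of shadowbanning: suppression implemented as a number, not a takedown.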

What gets amplified

Brand-safe lifestyle content
High-CPM audience demographics
Content that doesn't name race, queerness, or disability
Emotionally activating but "neutral" content
English-language content in US/EU markets

What gets suppressed

LGBTQ+ identity content flagged as "sensitive"
Racial justice discussions — advertiser risk
Sex education content from health educators
Disability visibility content — low CPM category
Non-English content in low-ad-spend markets

This is the link Noble's framework makes possible: platform capitalism doesn't just fail marginalized communities by accident. It fails them structurally, because the profit motive consistently assigns lower value to their identities, their stories, and their presence. The algorithm isn't bigoted — but the market it serves often is. And the algorithm has no mechanism for caring about the difference.

Up next

Instagram, YouTube, Spotify — up close

Noble's lens applied to each platform — who gets surfaced, who gets buried, and why.