Page 2 of 5 — The mechanism

How the machine works

Recommendation algorithms aren't magic — they're an economic system. Here's exactly what they're doing every time you open an app.

What the algorithm reads

Every action is a signal

You don't fill out a form telling Instagram what you like. You don't submit a preference survey to YouTube. The algorithm doesn't need you to. It reads your behavior directly — and behavior is far more honest than anything you'd say out loud.

Each signal is weighted, timestamped, and fed into a model that updates continuously. Try it below — pick how you engage and watch the algorithm build your profile in real time.
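A minimal sketch of that idea in Python. The signal weights, topic names, and decay factor below are invented for illustration, not any platform's real values; the point is only that implicit behavior (dwelling, rewatching) can be weighted more heavily than explicit actions, and that old signals fade as new ones arrive.

```python
from collections import defaultdict

# Hypothetical signal weights: implicit actions (dwell, rewatch) count
# for more than explicit ones, since behavior is harder to fake.
SIGNAL_WEIGHTS = {
    "like": 1.0,
    "share": 2.0,
    "comment": 1.5,
    "dwell": 3.0,     # pausing on a post without tapping anything
    "rewatch": 4.0,
}

class BehaviorProfile:
    """Toy model of a continuously updated interest profile."""

    def __init__(self, decay: float = 0.9):
        self.decay = decay             # older signals fade on every update
        self.scores = defaultdict(float)

    def observe(self, topic: str, signal: str) -> None:
        # Decay every existing score, then add the new weighted signal.
        for t in self.scores:
            self.scores[t] *= self.decay
        self.scores[topic] += SIGNAL_WEIGHTS.get(signal, 0.5)

    def top_interest(self) -> str:
        return max(self.scores, key=self.scores.get)

profile = BehaviorProfile()
profile.observe("fitness", "like")
profile.observe("politics", "dwell")
profile.observe("politics", "rewatch")
print(profile.top_interest())  # → politics
```

Two heavy implicit signals on one topic outweigh an explicit "like" on another almost immediately, which is the whole point: the profile converges on what you do, not what you say.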

Be the algorithm

Select the actions you take while scrolling and watch what the algorithm learns about you, and how quickly the loop tightens. The interactive demo walks through four steps:

1. Pick your actions.
2. See the behavioral profile those signals build.
3. Watch the loop-tightness meter climb from 0% as signals accumulate.
4. See what the algorithm serves you next.
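The tightening loop can be sketched as a toy simulation: the feed always serves the highest-scoring topic, and every engagement raises that score further. The topics, engagement probabilities, and the "tightness" metric (the top topic's share of total score) are all made up for illustration, not real platform measurements.

```python
import random

random.seed(42)

# Toy feedback loop: serve whichever topic currently scores highest;
# each engagement raises that score, making it even more likely to be
# served next time. "Tightness" = top topic's share of total score.
scores = {"cooking": 1.0, "news": 1.0, "outrage": 1.2}  # slight head start

def tightness(s):
    return max(s.values()) / sum(s.values())

for step in range(20):
    served = max(scores, key=scores.get)
    # Emotionally activating content gets engaged with more often.
    p_engage = 0.9 if served == "outrage" else 0.5
    if random.random() < p_engage:
        scores[served] += 1.0
    print(f"step {step:2d}: served={served:8s} tightness={tightness(scores):.0%}")
```

Because the winner is re-served and re-rewarded, the topic with a small initial edge and a high engagement rate dominates the feed within a couple dozen steps; the other topics never get a chance to accumulate score at all.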

The numbers behind the loop

The feedback loop isn't theoretical — it produces measurable outcomes. These figures reflect what happens when an algorithm optimizes purely for engagement time, without regard for what that time costs the user.

YouTube watch time: 70% of what people watch on YouTube is recommended by the algorithm, not searched for or chosen directly.

Engagement multiplier: 2.3×. Content that triggers an emotional reaction (outrage, anxiety, excitement) generates roughly 2–3× more engagement signals than neutral content.

The scroll

Infinite scroll was deliberately designed to remove stopping cues. There is no natural end point — the algorithm keeps serving, and the feed never runs out.
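In code terms, an infinite feed is just a generator with no termination condition: a stopping cue would be a hard page boundary, and this design deliberately has none. The candidate pool below is a stand-in; a real system would keep re-ranking fresh candidates rather than cycling three items.

```python
import itertools

# Toy infinite feed: take a candidate pool and serve from it forever.
# itertools.cycle never raises StopIteration, so the feed cannot run out —
# the structural equivalent of removing every stopping cue.
candidates = ["clip A", "clip B", "clip C"]

def infinite_feed(pool):
    yield from itertools.cycle(pool)

feed = infinite_feed(candidates)
first_ten = [next(feed) for _ in range(10)]
print(first_ten)
```

Contrast this with pagination, where the generator would be built from a finite list and `StopIteration` would force a natural end point, the "you're all caught up" moment infinite scroll was designed to eliminate.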

This is the core tension of the attention economy: what is most profitable for the platform is not necessarily what is most beneficial for the user. An algorithm optimizing for watch time will surface content that is emotionally activating, habit-forming, and difficult to disengage from — not because it values your wellbeing, but because those qualities produce more data and more ad revenue. The goal was never to serve you. It was to keep you.

Up next

Who gets hurt by the loop

Safiya Noble's research on what happens when the algorithm's profit motive meets race, identity, and power.