Your Child Thinks An Algorithm Is Listening To Them (And What You Should Do About It)

Does your child try to "train" their apps, content, and social media feeds? Understand why they do it and help your child turn digital myths into media literacy.

[Image: Minimalist child silhouette with a glowing white geometric network inside the head, symbolizing algorithmic thought.]

If you have watched your child use a tablet or phone recently, you may have noticed some behavior that you find baffling.

Perhaps you have seen them tapping the screen in a rhythmic, specific pattern before opening a "loot box" (a digital treasure chest) in a game. Maybe you have heard them whispering specific keywords to their phone, convinced that saying "cats" three times will make cat videos appear. Or perhaps they have told you, with absolute seriousness, that if they don't share a certain video with five friends, their account will be "cursed" with bad luck.

To an adult, this looks like nonsense. It looks like they are being gullible. But in the world of computer science and sociology, this behavior has a name. It is called "Algorithmic Folklore."

Setting the Scene: The Invisible Boss

To understand why your child is acting this way, we first need to define the invisible force they are interacting with: The Algorithm.

You hear this word on the news, but for a child on YouTube, TikTok, or Instagram, the Algorithm is something very specific. It is the computer program that decides what video plays next. It is the code that decides if their own video gets seen by 100 people or 0 people. It is the system that decides if the digital chest they just bought contains a rare prize or a common one.

The crucial thing to understand is that nobody outside those companies knows exactly how it works. The companies (Google, ByteDance, Roblox) keep the exact rules a secret. Because the rules are hidden, children are left to guess how to "win." They try to figure out the system by trial and error, rumors, and myths.
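To make "the Algorithm" less mystical, it can help to see how simple the underlying idea is. The sketch below is a deliberately tiny, invented example of a feed ranker: it is not any platform's real code (real systems are vastly more complex and secret), but it shows the basic loop of watching behavior feeding back into what gets shown next.

```python
# A toy, invented sketch of how a feed ranker *might* work.
# Real platforms use far more complex models; every name here is made up.
from collections import defaultdict

class ToyFeedRanker:
    def __init__(self):
        # How much this "user" seems to like each topic, learned from behavior.
        self.topic_scores = defaultdict(float)

    def record_watch(self, topics, seconds_watched):
        """Watching a video longer nudges its topics' scores up."""
        for topic in topics:
            self.topic_scores[topic] += seconds_watched

    def rank(self, candidate_videos):
        """Order candidate videos by the sum of their topics' scores."""
        def score(video):
            return sum(self.topic_scores[t] for t in video["topics"])
        return sorted(candidate_videos, key=score, reverse=True)

ranker = ToyFeedRanker()
ranker.record_watch(["cats"], seconds_watched=45)  # lingered on a cat video
ranker.record_watch(["news"], seconds_watched=3)   # skipped a news video

feed = ranker.rank([
    {"title": "Evening news", "topics": ["news"]},
    {"title": "Kitten compilation", "topics": ["cats"]},
])
print(feed[0]["title"])  # the cat video rises to the top
```

Notice that there is no magic here: the "rules" children are trying to guess are, at heart, statistics about what they watched yesterday.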

Looking Back to Understand Today

This might sound like a strange new obsession, but it is actually deeply familiar. We did the exact same thing.

Think back to the playground in the 1980s or 90s. Did you ever blow into a Nintendo cartridge to make it work? (Technically, the moisture from your breath could actually damage it, but we all believed it fixed the connection). Did you ever hold down a specific combination of buttons on a controller to try and catch a Pokémon? Did you ever receive a "chain letter" that promised bad luck if you didn't forward it to ten people?

[Image: Vintage game cartridge with a geometric cloud of breath entering the base, illustrating the evolution of digital superstitions.]

We didn't know how the technology or the world worked, so we invented rituals to feel like we had control. We created superstitions.

Your child is doing the same thing. But instead of blowing on cartridges, they are trying to outsmart a supercomputer.

The Evolution of the Myth (2016–2025)

When researchers first started studying this in 2016, they called it "Folk Theory" - the simple myths users created to explain why a post disappeared. But as the technology has evolved, so has the sophistication of the children using it.

New research from February 2025, published in the International Journal of Communication, suggests that children have moved beyond simple superstition. They now view themselves as "Co-Producers" of their feed.

The study found that young users no longer see the algorithm as just a passive judge. They see it as a partner they have to "train." This leads to a behavior researchers call "Algorithmic Gossip" - a term coined by Sophie Bishop and extended in recent work by scholars like Dr. Brady Robards.

[Image: Two child silhouettes sharing a glowing data cloud of geometric nodes, representing the collaborative "Algorithmic Gossip" of digital secrets.]

"Algorithmic Gossip" is when children share theories on how to manipulate the machine For example:

  • "If you pause on a video for exactly 3 seconds, the algorithm thinks you liked it."
  • "If you use this specific sound clip, the algorithm will boost you to the 'For You' page."

They are not just being superstitious; they are trying to reverse-engineer the code. They believe that if they perform the right "rituals," they can force the computer to give them what they want.

[Image: A child's profile being poured through a technical sieve, with only rigid cubes falling out, representing algorithmic filtering of personality.]

The Pressure of the Identity Filter

This interaction has a more troubling dimension. Throughout 2024 and 2025, researchers identified a phenomenon known as the "Identity Strainer". This refers to the anxious belief held by children that the computer code acts as a selective filter for their personality.

To understand this, we must look at the "Feed." This is the personalized, ever-scrolling list of videos, images, and posts that appears when your child opens an app like TikTok or Instagram. The algorithm (the underlying math) decides which pieces of content are "worthy" of appearing in that feed.

When a child posts a video about a personal interest - such as a specific hobby or a joke - and it receives very few views, they often conclude that the system has judged that specific part of their identity as "incorrect" or "unpopular". This leads to a persistent fear of "Shadowbanning," where a user believes the platform has secretly hidden their content from the world without notification.

The result is a distinct form of digital anxiety. Children begin to self-edit, removing any traits or interests they suspect the algorithm might dislike. They are no longer simply sharing their lives; they are performing for a silent, invisible judge in an attempt to regain control over their own visibility.

[Image: Silhouette of a child before a mirror reflecting a blank digital grid, a conceptual illustration of the fear of being hidden by code.]

This is where the baffling behavior comes from. All those rituals you see - the tapping, the whispering, the sharing with five friends - are not just games. They are desperate attempts to appease this judge. When your child taps the screen rhythmically, they are trying to prove they are "good" so the algorithm won't filter them out. They are trying to regain control over which parts of their identity get to be seen.

How to Handle the Superstition

The instinct is to tell them, "That’s not how computers work; stop being silly." But that shuts down the conversation. Instead, use this as a moment to teach them about how technology actually functions.

Here are three questions to help demystify the machine, updated for this new "co-production" era:

  • The "Training" Question: "You mentioned you have to watch that video to make the app happy. How do you think you are training the algorithm right now? What are you teaching it to show you tomorrow?" (This reframes them as the teacher, not the servant).
  • The "Testing" Question: "You think tapping the screen helps you get the rare item? That’s a cool theory. Let's test it like scientists. Let's try it 10 times without tapping and 10 times with tapping, and write down what happens."
  • The Empathy Bridge: "You know, when I was your age, we used to believe that if you said 'Bloody Mary' into a mirror, a ghost would appear. It feels the same, doesn't it? It's fun to believe there are secret rules."
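If your child enjoys the "Testing" question, you can even tally the experiment together with a few lines of code. The numbers below are invented for illustration; the point is simply to compare the success rate with and without the ritual.

```python
# Toy tally for the "tapping" experiment described above.
# The 1s and 0s are hypothetical notes from 10 tries each, not real data.

def rare_item_rate(trials):
    """Fraction of trials that produced a rare item."""
    return sum(trials) / len(trials)

# 1 = got a rare item, 0 = didn't
with_tapping    = [0, 1, 0, 0, 0, 0, 1, 0, 0, 0]
without_tapping = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]

print(rare_item_rate(with_tapping))     # 0.2
print(rare_item_rate(without_tapping))  # 0.2 - tapping changed nothing
```

Ten tries each is far too few for a real statistical conclusion, but as a family activity it teaches the core habit: test the ritual instead of trusting the rumor.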

Your child is trying to find patterns in a chaotic digital world. That is a sign of intelligence, not stupidity. By guiding them gently, you can turn their "folklore" into "media literacy" - helping them see the reality behind the computer curtain.

References & Further Reading

Felaco, C. (2025). Making Sense of Algorithm: Exploring TikTok Users' Awareness of Content Recommendation. International Journal of Communication.

Robards, B., et al. (2024). Algorithmic gossip in young people's accounts of 'unhealthy' advertising on social media. AoIR Selected Papers of Internet Research.

Eslami, M., et al. (2016). First I "like" it, then I hide it: Folk Theories of Social Feeds. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems.

Bishop, S. (2019). Managing visibility on YouTube through algorithmic gossip. New Media & Society.

[Image: Human and wireframe hands weaving a glowing thread representing a video timeline, symbolic of the child-algorithm partnership in the co-production era.]
