Filter Bubbles: Why Growth Is So Hard

“A filter on Instagram was like if Twitter had a button to make you more clever.” ― Sarah Frier

MENTAL MODEL

A filter bubble is a state of intellectual isolation produced by personalized search, recommendation systems, and algorithmic curation. Our browsing and search history is collected: everything from which links we click to which movies we queue and which news stories we read. From that data a consumer profile is built, and we receive personalized content that places us inside a “bubble” of information beyond which we do not see.

This process is neither random nor secret. Data is gathered on us. Search engines know who we are and what we like, and they serve us the products, services, and results that best fit. Each new search we make and website we browse adds to that backdrop of information, the consumer profile, fine-tuning the model until the media it curates “hits the spot”. As of 2011, an engineer said Google used 57 different signals, such as location, type of computer, gender, and age, to tailor search results. That was 2011.

You can test this quite easily. Google “bottle”. Now ask a friend, family member, or co-worker to search “bottle” and compare the results. There will be some overlap, but one of you might be sent to environmental initiatives protesting plastic bottles whilst the other is prompted to order a thermal bottle for their next trip to the mountains. Filter bubbles close us off from new ideas, subjects, and important information. They create the impression that our narrow self-interest is all that exists. Eli Pariser, who coined the term, criticizes Google for giving users “too much candy and not enough carrots.”
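
The “bottle” divergence can be mimicked with a toy re-ranker. This is a hand-rolled sketch, not how any real search engine scores results; every document, signal, and weight below is invented purely for illustration.

```python
def personalized_rank(results, user_signals):
    """Re-rank results: base relevance plus a small boost per matching signal."""
    def score(doc):
        boost = sum(1 for s in doc["signals"] if s in user_signals)
        return doc["relevance"] + 0.5 * boost
    return sorted(results, key=score, reverse=True)

# Two candidate results for the query "bottle", equally relevant on paper.
results = [
    {"title": "Thermal bottles for mountain trips", "relevance": 1.0,
     "signals": {"outdoors", "shopping"}},
    {"title": "Petition against plastic bottles", "relevance": 1.0,
     "signals": {"environment", "activism"}},
]

# Two users issue the same query but carry different inferred profiles.
hiker_feed = personalized_rank(results, {"outdoors", "travel"})
activist_feed = personalized_rank(results, {"environment", "news"})
```

With identical base relevance, the matching signals alone decide which result each user sees first: the hiker gets the thermal bottle, the activist gets the petition, same query, different worlds.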

Perhaps the worst part is how little awareness there is of filter bubbles. According to an article in The Guardian, more than 60 percent of Facebook users are unaware of any curation at all. Filter bubbles also segregate political, social, economic, and cultural outlooks. They amplify and reinforce existing beliefs and biases by feeding them back to us, causing extremely polarized communities to form. What we see is filtered many times over by artificial intelligence algorithms. We play a largely passive role in this, as the technology limits our exposure to information that could challenge our views.

Here’s a brief outline of how they work. A platform like Google collects data on your past behavior. It builds an algorithm that prioritizes content matching that behavior. You reinforce the loop yourself, because humans have an innate confirmation bias: we seek out what we already believe to be true. The cycle repeats, each browsing session fine-tuning the model. Google ends up knowing you better than you know yourself. The impact is massive. Filter bubbles hinder critical thinking and intellectual growth. They reinforce ideological divides and deepen existing misunderstandings between groups. They stifle creativity by limiting the cross-pollination of ideas, and they can be exploited to spread propaganda, sway elections, and shape policy. That is what makes this mental model, navigating filter bubbles properly, invaluable.
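
The cycle just described can be reduced to a toy simulation. This is a deliberately crude sketch of the reinforcement dynamic, not any platform’s actual recommender; the topics, weights, and click rule are all invented.

```python
import random
from collections import Counter

TOPICS = ["politics", "sports", "science", "travel", "cooking"]

def recommend(profile, k=10):
    """Sample a feed of k items, weighted by the user's inferred interests."""
    topics = list(profile)
    weights = [profile[t] for t in topics]
    return random.choices(topics, weights=weights, k=k)

def simulate(rounds=50, seed=7):
    """Run the collect -> recommend -> engage -> update cycle."""
    random.seed(seed)
    profile = {t: 1.0 for t in TOPICS}  # uniform starting interests
    for _ in range(rounds):
        feed = recommend(profile)
        # Confirmation bias: the user engages with whatever topic already
        # dominates the feed, and the platform updates the profile.
        clicked = Counter(feed).most_common(1)[0][0]
        profile[clicked] += 1.0
    return profile
```

Even from a uniform start, whichever topic gets an early run of clicks is recommended more, clicked more, and recommended more still; after a few dozen rounds one topic typically holds most of the total weight. The bubble is not imposed in one step, it accretes, which is why it is so hard to notice from the inside.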

Real-life implications of filter bubbles:

  • Personal growth: an entrepreneur seeking to understand consumer behavior might deliberately search for viewpoints outside their demographic bubble or conduct surveys across diverse groups to minimize bias;

  • Education: teachers should encourage students to use multiple sources for their research, helping them think critically, such as by assigning readings from opposing perspectives;

  • Business: a company could avoid echo chambers by consulting diverse teams when designing a product or service, including individuals from various backgrounds to ensure decisions are not skewed by a single worldview;

  • Policy: governments ought to recognize the impact of filter bubbles on public discourse and invest in media literacy programs, like initiatives that help citizens identify and challenge filter bubbles when they run into them;

  • Social media: platforms could design algorithms that encourage serendipity, showing users content outside their regular interests, but that’s bad for business.

How you might use filter bubbles as a mental model: (1) understand your bubble, recognizing that nobody is immune to them, and evaluate the sources of your information and reflect on how diverse they are; (2) actively seek counter-perspectives, deliberately exposing yourself to opposing viewpoints that challenge what you believe; (3) diversify your sources, consuming content from multiple platforms and in different mediums to get a balanced perspective; (4) encourage open dialogue, engaging in respectful discussions with people who think otherwise, and use these interactions to stress-test your ideas; (5) leverage anti-bubble tools, such as browser extensions, apps, and services that surface unfiltered or diverse content.

Thought-provoking insights. “A mind is like a parachute. It doesn’t work if it is not open,” says Frank Zappa, reminding us of the importance of staying open to diverse ideas, even ones we disagree with. “The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge,” a line attributed to Stephen Hawking, highlights the danger of believing the incomplete or one-sided narratives fed to us. “If you only read the books everyone else is reading, you can only think what everyone else is thinking,” writes Haruki Murakami, stressing the importance of seeking out diverse perspectives. You have already sharpened your analytical skills by recognizing that filter bubbles exist. Take the next step by challenging your sources. Otherwise you might get stuck, and stay stuck, in a bubble.