
Eli Pariser first noticed the phenomenon that he calls the “Filter Bubble” when his conservative “friends” began to disappear from his Facebook page. Now you might think that for Pariser, co-founder of the unabashedly liberal website MoveOn.org, this was good news. Not so, he asserts in his book titled, not surprisingly, The Filter Bubble. Pariser is apparently a throwback to the days of the founding fathers when savvy politicians believed in keeping their friends close and their enemies closer; when you learned by studying the perspective of your adversary. So, he wanted to find out why the Tea Party had left his Facebook party. Not their choice, Pariser came to discover — they had been filter bubbled out.

What Pariser learned in the course of researching his book was that Facebook had shown his conservative friends the door because Facebook, or rather Facebook’s filtering algorithm, had decided that he really wasn’t all that interested in them, based on the fact that he clicked on his Tea Party buddies less often than on his MoveOn cronies. So why clutter his page with them? “Off with their heads!”

I advise you to read the book. It is interesting, and more than a little chilling. Basically, here’s the CliffsNotes version of what is going on: Most of THE INTERNET makes its money from advertising. The ubiquitous little ads that pop up on the pages you go to while online. The hosting page — be it Google or Lands’ End, the Gap, Spotify, whoever — gets a bigger piece of the ad revenue if you actually click on an ad. Hence the more the page “knows” about you, the more it can push — in split seconds — ads onto the page that are tailored for your very personal profile. Try this — do a Google search for Labrador retrievers, play around on dog pages for a while. Now go to some other site — like Amazon or Yahoo news. Look at the ads. Seem a little more “doggy” than usual? I told you it was a touch creepy. There are very large, very wealthy companies that do nothing but gather our “click streams” and sell them to the algorithm-makers.

You can actually understand it from a business perspective. Advertisers are simply trying to place their products in front of people whose own Internet behavior indicates that they are interested in the product. Seems harmless until we remember the case of the vanishing conservatives. THE INTERNET isn’t simply tracking and constructing filters based on the products we like; it is also building filters that keep out the ideas that we don’t like, while foregrounding our proclivities. Internet algorithms try to construct, and lead us to, our vision of “a perfect world.” You know the saying — someone asks you a question and you respond, “Well, in a perfect world...” What we mean is in our perfect world, the world as we would like it to be.

In the movie Heaven Can Wait — the 1978 Warren Beatty version — the welcoming angel tells Beatty’s character that heaven is “a product of your image and that of those who share your image,” a perfect world, defined by what we, and our “friends” believe a perfect world should be. That is a very prescient “internet-algorithm-esque” concept for a 1978 chick-flick!

There are, however, problems inherent in letting Internet algorithms define a perfect world for us, based upon their perception of our behavior. I am reminded of the elementary schoolyard where I played as a child. It was, by contemporary standards, a death trap. Asphalt paving everywhere except on the fields where we played baseball and football. Those were dirt: not grass, dirt. The slides were really tall — you sort of had to lean back to see the top. They were shiny steel with four-inch sides. Sliding down on summer days was a delicate balance. The heat seemed to increase your speed, but if you were too light to get all the way off the end, you stuck. First-degree burns on your butt. So you leapt off to the easier embrace of the landing area, which was, remember, asphalt. Similarly, sliding in a baseball game was a decision one did not make lightly. You measured the transient heroism of victory against the possibility of major abrasions and a tetanus-shot trip to the nurse’s office. All in all we had a good time.

My daughters grew up as playgrounds were transitioning into “a perfect world.” Everything is now low and slow, plastic and padded. No doubt injuries still occur, though it seems you would probably have to put some planning and effort into it. According to the TV ads, successful injuries are dealt with by a phalanx of perfect moms wielding spray-on antiseptic and instant bandages.

The point is this — sometimes the world that is constructed by others for our “own good” damages the depth of our experience and compromises the legitimacy of our conclusions. Internet filters that show us only that which we already believe and desire destroy the opportunity for the serendipitous discovery that comes from going somewhere we have never been before. They deny us the opportunity to learn from those who think differently than we do. They create a perfect world in which everything seems low and slow, plastic and padded.

But wait! There is a software fix for this world in a bubble that might even increase Internet profits. Listen up, moguls. Photoshop has a feature that lets you select parts of an image: either parts that you click on, or parts that share a color. The point is that it lets you select part of an image based on certain criteria. Once you have selected those parts of the image you can go to the Select menu, where among the options is Inverse — which means “select all those elements that I have not chosen.”

You see where I am going here? If the algorithm can decide what it thinks I want, can’t it also decide what I don’t want? Wouldn’t it be cool if I could tell Google to “Select Inverse?” Create a search based on the notion that what I haven’t experienced might be more intriguing than what I have already done? Think about it as a clock face. You are standing in the middle and facing 12 o’clock. Noon is “a perfect world,” the result the algorithm predicts you want it to find. Six o’clock is “Select Inverse.” Why can’t I ask for that “6 o’clock search” instead?
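For the programmers in the audience, the “6 o’clock search” is almost embarrassingly easy to sketch. Here is a minimal Python illustration — the `affinity` function is hypothetical, standing in for whatever score a real search engine’s profiling model assigns, and an ordinary personalized search would simply sort by it in descending order:

```python
def select_inverse_search(results, affinity):
    """Rank results by how little the profile predicts we'd like them.

    results:  a list of items (URLs, headlines, whatever the engine returns)
    affinity: a hypothetical scoring function mapping an item to 0..1,
              where 1.0 means "the algorithm is sure this is 'you.'"
    """
    # A personalized search sorts by affinity, descending (noon).
    # "Select Inverse" just flips the sort: least-you first (6 o'clock).
    return sorted(results, key=affinity)
```

The point of the sketch is that the inverse search needs no new data and no new model; the profile the engine already built does all the work, merely read upside down.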

And let’s not forget the numbers in between. Photoshop also has a slider attached to many of its functions called opacity or intensity. Essentially, it is a function rheostat. You move from, say, 100 percent opacity, where you cannot see through an image at all, to 0 percent opacity, where the image disappears and only the background is visible. Why not a “Search Slider” that lets you move the algorithm around? Say 1 o’clock equals a search with your characteristics intensified 30 percent, and 2 o’clock is “you” intensified 60 percent. Meanwhile 11 o’clock reflects a search with your characteristics deflected by 30 percent, and at 10 o’clock “you” are deflected 60 percent. Fader Bar Search. Why not?
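The fader bar is just as easy to sketch as the inverse. In this hypothetical Python version the clock face is collapsed to a single dial from +1.0 (noon, full “you”) through 0.0 (personalization switched off) down to -1.0 (6 o’clock, full “Select Inverse”); the `affinity` scoring function is again an assumed stand-in for the engine’s real profiling model:

```python
def fader_search(results, affinity, dial):
    """Rank results with a personalization 'fader bar'.

    dial = +1.0 : noon -- most-you first (ordinary personalized search)
    dial =  0.0 : personalization ignored (original result order kept,
                  since Python's sort is stable when all keys are equal)
    dial = -1.0 : 6 o'clock -- least-you first ("Select Inverse")
    Intermediate values (e.g. +0.3 for 1 o'clock, -0.6 for 10 o'clock)
    scale how strongly the profile pulls on the ranking.
    """
    # Weight each item's affinity by the dial; a negative dial inverts it.
    return sorted(results, key=lambda item: dial * affinity(item), reverse=True)
```

One design note: because a dial of 0.0 makes every sort key identical, the stable sort leaves the unpersonalized ordering untouched, which is exactly the neutral midpoint the rheostat metaphor calls for.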

There are obvious and intriguing existential implications in the fact that moving in both a “positive” and a “negative” direction will eventually bring us to the same 6 o’clock “Select Inverse” world that stands in algorithmic opposition to the perfect world that the Filter Bubble seeks to create for us. But, in the final analysis, shouldn’t we be allowed to choose the direction and intensity of the journey? Isn’t that really what “searching” means?
 

Robert Schrag

Robert Schrag has been a communication professor for over 40 years. He is also a painter, sculptor, husband, and father of two.
