Have you ever logged in to Amazon.com and looked through a list of recommendations?
“A Savage Garden CD…Pleather Moon Boots…a DVD copy of Leprechaun: In The Hood (starring Ice-T)!? It’s like they know me!”
Well, that’s because they do know you, to some extent; websites like Amazon or Facebook build algorithms around user preferences and behavior to determine what content you see, in hopes of improving your user experience, holding your interest and, if possible, getting you to buy something. This is all the result of a growing trend on the web in recent years known as “personalization.” While personalizing the web for each person may seem like a logical step toward making the web more user-friendly, there are a surprising number of dissenting voices.
A few months ago I read an interesting book by one of those dissenters: The Filter Bubble: What the Internet Is Hiding From You, by Eli Pariser. Pariser is one of many critics of the personalization trend who worry that altering or filtering content based on a machine’s idea of our identity will keep us in a bubble with other like-minded people, seeing only the news and information we “want” to see. His argument has a definite political bent: he feels that personalization is inimical to democratic ideals, a Balkanization of the web that “filters” out access to challenging viewpoints and new ideas. (For example, if Amazon didn’t personalize recommendations, I might be a little more open-minded and able to break out of my pleather moon-booted cocoon.)
While opponents of the personalization trend have put forth intelligent critiques, there has recently been some pushback. An article by Farhad Manjoo over at Slate reports on a new study conducted by Eytan Bakshy, an information theorist at the University of Michigan, which suggests that the web is not as much of an echo chamber as critics like Pariser may think. Bakshy’s experiment looked at two different groups on Facebook and the ways they shared links with their “friends.”
Facebook runs an algorithm called EdgeRank that calculates which friends’ links show up in your feed: if a close friend is someone you communicate with a lot on the site, for instance, you’ll see their links before you see any shared by a second cousin you friended three years ago. Bakshy used an algorithm of his own to randomly censor which links each group saw: one group saw the links their friends shared (and could thus choose to share them with their own Facebook friends), while the other group did not see those links in their feeds (though they could still find them on their own and choose to share them).
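To make the setup concrete, here is a minimal sketch of the two moving parts described above: ranking shared links by tie strength (the rough idea behind EdgeRank, which in reality weighs many more signals, such as recency and content type) and randomly withholding links from a user’s feed, as in Bakshy’s manipulation. The function names, data shapes, and numbers are all hypothetical, invented for illustration; this is not Facebook’s or Bakshy’s actual code.

```python
import random

def rank_feed(links, tie_strength):
    """Order shared links so that links from stronger ties appear first.

    Hypothetical EdgeRank-style sketch: each link carries its sharer's name,
    and tie_strength maps a sharer to a score in [0, 1]. Unknown sharers
    default to 0 (weakest tie)."""
    return sorted(links,
                  key=lambda link: tie_strength.get(link["sharer"], 0.0),
                  reverse=True)

def withhold_randomly(links, rate, rng=random):
    """Sketch of the experimental censoring: each shared link is dropped
    from the feed with probability `rate`, so the user can only share it
    if they discover it independently."""
    return [link for link in links if rng.random() >= rate]

# Hypothetical feed: one link from a close friend, one from a distant cousin.
links = [{"url": "a.example", "sharer": "cousin"},
         {"url": "b.example", "sharer": "close_friend"}]
strength = {"close_friend": 0.9, "cousin": 0.1}

feed = rank_feed(links, strength)        # close friend's link ranks first
control = withhold_randomly(links, 0.0)  # rate 0: nothing withheld
censored = withhold_randomly(links, 1.0) # rate 1: everything withheld
```

Comparing what the “censored” group shares anyway against what the control group shares is, in essence, how the experiment separates links spread by the feed from links people would have found on their own.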
Bakshy’s experiment found two important things. First, the closer you are with someone on Facebook, the more likely you are to share their links. That seems obvious enough, and appears to confirm Pariser’s echo-chamber thesis. The other finding, however, is equally simple but more surprising: we still share plenty of links we get from weak ties. These links come from people outside your bubble of like-minded friends, people who likely visit websites you don’t look at every day, and they introduce “novel” information into the system. Considering that most of your Facebook friends are weak ties, the social-network experience may not be as sheltered as you think: we’re still seeing a lot of links posted by those weird distant cousins, and sharing them too.
The debate is by no means over, of course, and it deserves further investigation and study. We here at Spokeo People Search have a vested interest in this topic, as we are all about bringing people together and providing equal access to information. No matter where you stand on the issue, I think we can all agree that the Internet should be a place to open your mind and expand your knowledge.