The Trouble With the Echo Chamber Online
By NATASHA SINGER
Published: May 28, 2011
ON the Web, we often see what we like, and like what we see. Whether we know it or not, the Internet creates personalized e-comfort zones for each one of us.
Give a thumbs up to a movie on Netflix or a thumbs down to a song on Pandora, de-friend a bore on Facebook or search for just about anything on Google: all of these actions feed into algorithms that then try to predict what we want or don’t want online.
And what’s wrong with that?
Plenty, according to Eli Pariser, the author of “The Filter Bubble: What the Internet Is Hiding From You.” Personalization on the Web, he says, is becoming so pervasive that we may not even know what we’re missing: the views and voices that challenge our own thinking.
“People love the idea of having their feelings affirmed,” Mr. Pariser told me earlier this month. “If you can provide that warm, comfortable sense without tipping your hand that your algorithm is pandering to people, then all the better.”
Mr. Pariser, the board president of the progressive advocacy group MoveOn.org, recounted a recent experience he had on Facebook. He went out of his way to “friend” people with conservative politics. When he didn’t click on their updates as often as those of his like-minded contacts, he says, the system dropped the outliers from his news feed.
Personalization, he argues, channels people into feedback loops, or “filter bubbles,” of their own predilections.
Facebook did not respond to e-mails seeking comment.
In an ideal world, the Web would be a great equalizer, opening up the same unlimited vistas to everyone. Personalization is supposed to streamline discovery on an individual level.
It’s certainly convenient.
If you type “bank” into Google, the search engine recognizes your general location, sending results like “Bank of America” to users in the United States or “Bank of Canada” to those north of the border. If you choose to share more data, by logging into Gmail and enabling a function called Web history, Google records the sites you visit and the links you click. Now if you search for “apple,” it learns and remembers whether you are looking for an iPad or a Cox’s Orange Pippin.
If you’re a foodie, says Jake Hubert, a Google spokesman, “over time, you’ll see more results for apple the fruit, not for Apple the computer, and that’s based on your Web history.”
The same idea applies at Netflix. As customers stream movies, the recommendation system not only records whether those viewers generally enjoy comedies but also can fine-tune suggestions to slapstick or more cerebral humor, says John Ciancutti, the company’s vice president for personalization technology.
But, in an effort to single out users for tailored recommendations or advertisements, personalization tends to sort people into categories that may limit their options. It is a system that cocoons users, diminishing the kind of exposure to opposing viewpoints necessary for a healthy democracy, says Jaron Lanier, a computer scientist and the author of “You Are Not a Gadget.”
“People tend to get into this echo chamber where more and more of what they see conforms to the idea of who some software thinks they are — like a Nascar dad who likes samurai swords,” Mr. Lanier says. “You start to become more and more like the image of you because that is what you are seeing.”
Mr. Lanier, who is currently doing research at a Microsoft lab, emphasized that his comments were his own personal opinions.
If you want to test your own views on personalization, you could try a party trick Mr. Pariser demonstrated earlier this year during a talk at the TED conference: ask some friends to simultaneously search Google for a controversial term like gun control or abortion. Then compare results.
“It’s totally creepy if you think about it,” said Tze Chun, a filmmaker who agreed to participate in a similar experiment at a recent dinner party we both attended in Brooklyn. Five of us used our phones to search for “Is Osama really dead?,” a phrase Mr. Chun suggested.
Although our top 10 results included the same link — to Yahoo Canada answers — in first place, two of us also received a link to a post on jewishjournal.com, a newspaper site. Meanwhile, Mr. Chun and two other filmmakers had links to more conspiratorial sites like deadbodies.info.
For Mr. Chun, who visits a variety of true-crime Web sites as part of his screenplay research but tends to favor sites that sell vintage T-shirts in his private life, the personalization felt a little too, well, personal.
“You are used to looking at the Internet voyeuristically,” he said. “It’s weird to have the Internet looking back at you and saying, ‘Yeah, I remember things about what you have done’ and gearing the searches to those sites.”
With television, people can limit their exposure to dissenting opinions simply by flipping the channel to, say, Fox from MSNBC. And, of course, viewers are aware they’re actively choosing shows. The concern with personalization algorithms is that many consumers don’t understand, or may not even be aware of, the filtering methodology.
Personalized Web services, Mr. Pariser says, could do more to show users a wider-angle view of the world.
But some of the most popular sites say they have already built diversity into their personalization platforms.
“People value getting information from a wide variety of perspectives, so we have algorithms in place designed specifically to limit personalization and promote variety in the results page,” said Mr. Hubert, the Google spokesman. He added that the company looked forward to “carefully reviewing Mr. Pariser’s analysis of this important issue.”
At Netflix, the system recommends a mix of titles, some with high confidence of viewer enjoyment and others about which it is less sure, Mr. Ciancutti says. Netflix’s flat monthly rate for unlimited streaming, he adds, encourages people to select films, like documentaries, that they might not have chosen otherwise.
INDIVIDUAL users could also do their part.
Mr. Pariser suggests people sign up for a range of feeds on Twitter, where the posts are unfiltered. Mr. Lanier suggests Tea Party members swap laptops for a day with progressives and observe the different results that turn up on one another’s search engines.
If we don’t chip away at the insulation of consensus, they caution, the promise of the World Wide Web could give way to a netherworld of online narcissism.