Eli Pariser: Beware Online 'Filter Bubbles'

 

http://www.huffingtonpost.com/2011/12/14/internet-censorship_n_1147078.html?1323877374

 

In this special year-end collaboration, TED and The Huffington Post are excited to count down 18 great ideas of 2011, featuring the full TEDTalk with original blog posts that we think will shape 2012. Watch, engage and share these groundbreaking ideas as they are unveiled one-by-one, including never-seen-before TEDTalk premieres. Stand by, the countdown is underway!

Watch author Eli Pariser discuss secret censorship on the Internet, and read his accompanying blog post below following up on his talk.

 

People love sharing lists -- the list is one of the formats that fare well in a Filter Bubble world. So here's a list of five of the most interesting ideas I've come across since I gave my presentation at TED and published The Filter Bubble: What the Internet Is Hiding from You.

 

1. Who owns the right to infer things about you? According to Marissa Mayer at Google, some credit card companies can now use your purchasing decisions to predict -- with 95% accuracy, two years out -- whether you're going to get a divorce. This raises some interesting ethical questions: Do companies have an obligation to reveal to us the inferences they make about us? Should you be able to learn that your credit card company is betting against your relationship? What about in the health sphere -- if Acxiom infers from your purchases that you're at high risk of suicide, does it have an obligation to let you or someone else know? I haven't found any satisfying answers to these questions -- but we ought to start thinking about them more seriously.

 

2. Transparency's moving in the wrong direction. Imagine a company where every communication is transparently available to every employee. While corruption at the top would be harder to hide, the overall effect would be to empower the bigwigs -- because the kind of private coordination that people use to organize and aggregate power would be impossible. Speak ill of the boss, and you get laid off -- that's how power works.

In an ideal world, I'd argue, transparency would vary with power -- the more powerful you are, the brighter the spotlight on your activities. But what we're seeing now is the opposite: the details of most folks' lives have never been more available to corporations, governments, and even private citizens. Meanwhile, thanks to the Citizens United Supreme Court decision, the wealthy and powerful are able to cloak their political activities, and there are a variety of services available to scrub private information from the web -- for a price. We have transparency for the 99%, but not the 1%.

 

3. Robot journalism. Mostly, I've been focused on the impact of code-based editors on how we consume news. But it's worth noting that drone-like mini-robots are beginning to do some real news gathering as well. Check out this footage from a tiny helicopter piloted by folks at The Daily, or this stunning video from a protest in Poland. It won't be long before every news bureau -- and more than a few amateurs -- are using these things to push past military lines, look in celebrities' windows and generally change all of our assumptions about how video news is gathered -- for better or worse.

 

4. The difference between curiosity and value. Recently, The Huffington Post tweeted about an article with a headline to the effect of "Guess Which Celebrity Got Into a Horrible Accident Today?" I'll cop to clicking -- HuffPost did an excellent job of piquing my interest. But I couldn't tell you which celebrity it was, because I've forgotten -- there was very little lasting value in that article. These kinds of curiosity-driven clicks are one of the primary signals that sites use to personalize content. But unless they're paired with something that measures the amount of value we take away from a media experience, they're only so useful. And they lead toward a world of curiosity-baiting headlines with no payoff.

What if, in addition to click signals, personalizing websites also sent folks a list of the 50 articles they'd recently visited and asked them to mark the three that gave them some lasting value? A personalized feed that took into consideration not just what we click on but what we take away from it could help us build information diets that are both delicious and substantive.
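A blend of the two signals could be sketched roughly like this -- a toy illustration only, with invented field names, weights, and numbers, not any real site's ranking algorithm:

```python
# Toy sketch of the proposal above: rank articles by blending a
# click-through signal with explicit "this had lasting value" marks.
# All field names, weights, and figures here are made up for illustration.

def blended_score(article, click_weight=0.3, value_weight=0.7):
    """Combine click-through rate with the rate of explicit value marks."""
    click_signal = article["clicks"] / max(article["impressions"], 1)
    value_signal = article["value_marks"] / max(article["clicks"], 1)
    return click_weight * click_signal + value_weight * value_signal

articles = [
    {"title": "Guess Which Celebrity...",
     "clicks": 900, "impressions": 1000, "value_marks": 10},
    {"title": "How the Eurozone Crisis Works",
     "clicks": 300, "impressions": 1000, "value_marks": 120},
]

ranked = sorted(articles, key=blended_score, reverse=True)
# With these (invented) numbers, the substantive article outranks the
# curiosity-bait one despite getting far fewer clicks.
```

The point of the weighting is that a feed tuned only on clicks would rank the celebrity headline first; adding even a crude value signal flips the order.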

 

5. Seven Things Human Editors Do That Algorithms Don't. As we move toward an algorithmically edited world, there are still a bunch of things that human editors do better. This Harvard Business Review piece has a bit more detail, but here's the short list: anticipating what people will be interested in, taking risks in recommendations, giving folks a sense of the whole picture, pairing stories together in a way that adds value, highlighting stories of social importance, valuing content that blows folks' minds, and building the kind of trust that leads audiences to topics beyond their core zone of interest.

 

Oh, and one more thing: As I've been discussing The Filter Bubble, the aspect of the problem I've become most focused on is the Information Junk Food problem. In many ways, the important question isn't just whether you see a diverse set of political viewpoints, but whether most people see anything from the political or civic realm at all. I'm working on a new media project aimed at getting ideas that matter in front of millions of people -- if that sounds like fun to you, maybe you should come work with us.