Eli Pariser gave a TED Talk recently about how Google, Facebook and the rest track just about everything you do online and feed that data into algorithms that personalize the way you experience the Internet. What you see and who you interact with are invisibly decided for you based on your past preferences, and you probably aren't even aware of it. Or if you are aware of it, you're also aware that there's not much you can do about it.
See, you’re surfing in a “filter bubble” surrounded by people just like you and things you’ve already expressed an interest in, and you’re increasingly cut off from differing viewpoints, unbiased information and new ideas. Here’s a link to the video: it’s worth a watch, if you have nine minutes.
This has been simmering in the back of my brain for a while, along with my thoughts about “Internet Privacy,” which can be summed up as follows: The Internet is an utterly fantastic and magnificent thing. Of course I’m willing to barter a certain amount of my privacy to keep it coming, because I have become completely and utterly dependent on it. I have no desire to go through information withdrawal.
This is because I'm old, and I know what it's like not to have everything just a mouse click away. I still remember using microfiche and going to the library. When "travel agent" was a legitimate career choice. When "communicating" meant telephones, postage stamps or face-to-face encounters. When a 3 a.m. headache meant just imagining the tumor taking over your brain, not verifying its statistical likelihood. I love my computer. It has opened up the world!
An interactive community has to go both ways, by definition. I don't mind if everybody knows my birthday, that I'm a registered Democrat, that I have two children under the age of 20, that I'm married and that I have a college degree. In exchange, I've re-found old friends and enriched my knowledge base enormously. Go to town on it, I say!
Just don't interrupt my Internet connection! Besides, there are plenty of simple safeguards you can put in place to protect yourself, like not answering e-mails riddled with grammatical mistakes and checking your credit card statement every month.
But the filter bubble thing brought me to a full stop.
Wait a minute, I’m telling myself. Someone else thinks they’ve got me figured out. Is my world getting smaller, not bigger? Not good. Not good at all.
I cannot abide an algorithm using personal data to pigeonhole my personality and then making assumptions about what I want to see and who I want to listen to, all behind my cyberback. Being in a bubble not of my own making really ticks me off.
I wanted to write about it right away. But I couldn’t because first, I had a five-day headache, and then, when my brain recovered, I couldn’t quite get my mind around it anymore. In fact, the longer I thought about it, the more difficult it got. My filter bubble has gone from black and white to thoroughly grayscale.
First, it's obvious that filtering has to happen, because the Internet is a gargantuan monster. There is simply too much information. It has to be sifted somehow; an algorithm has to be involved at some level. Pariser is playing off our fear that computers are going to take over the world. (Which they will, I promise, but I'll tell you about that in another post.) No matter how you boil it down, at some point a human will be involved in setting up the filtering rules. Since I'm a geek, I want it to be me. But unlike the average geek, I don't write code, so this is a problem. Even true geeks haven't cracked this yet. (But I'm sure they will. I'd like to be notified.)
Second, Pariser has obviously never worked at a newspaper. He has a very romanticized view of editors as information curators. Any journalist knows that editors are evil beings from alien planets whose sole desire is to quash original thought and wonderful writing, while simultaneously removing any truly controversial content that will in any way jeopardize sales or ad revenues or lead to libel cases. News Flash, Eli! Editors don’t make content decisions based on what they think you need to read. They make content decisions on what they think will sell the paper. As filter gurus, they’re not much better than zombie algorithms.
That brings up the whole meal analogy. I don’t want anyone telling me to eat my spinach in reality, so why would I tolerate it metaphorically? The idea of some “ubereditor” out there deciding what my intellectual spinach is and then determinedly feeding it to me is just as repellent to me as my cyberexperience being decided by a market-based algorithm. Pass the dessert and keep it zipped, Eli, and nobody will get hurt.
Then there’s “relevance.” Everyone hauls out this quote from Mark Zuckerberg as “chilling” evidence of the nefariousness of Facebook’s hold on our psyches:
“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
We’re being dumbed down to the level of backyard rodents! But look at it from a Zen perspective. Is it really that bad to inhabit our immediate present? (Maybe the squirrel is carrying the West Nile virus!) In short, who’s to decide what’s “relevant” and what isn’t? Is death in Africa more relevant, say, than rape in downtown Milwaukee? There are a million issues in the world, all of which deserve our attention. Some of us are better at this than others. Some of us can eat a whole cup of spinach while others reach their limit after just a few tablespoons. It’s simply not one-size-fits-all.
In fact, these algorithms just underline what we all know already, don’t they? People like to hang out with other people who think like they do. That’s why we go to church and PTA meetings, join soccer teams and orchestras. We stereotype ourselves and other people all the time, and willingly. It’s not necessarily a good thing, but for most people, it makes life easier and it’s what they choose to do. Ambiguity is uncomfortable, particularly if you’re really attached to your way of looking at things.
And that gets right to the root of the problem: what I do think filter bubbles have done is lower the general level of discourse within and between non-intersecting bubbles. When you know that everyone you're talking to shares your opinion, you can lambast the other guys without a qualm. Nobody takes you to task for it. You don't have to be polite. On the Internet, everyone's a journalist without an editor. And to make it worse, they're all clamoring for attention. The more outrageous, rude and polarized their statements, the more traffic they get and the more legitimate they feel. The absence of outrage in their listeners and readers is interpreted as carte blanche to carry on. And those who guilelessly wander in and get an earful learn very quickly not to stay and debate. Like a Cheeto on an anthill, they're quickly torn to shreds.

I think this has had a truly noxious effect on the level of civil discourse in the US, online and off. You don't have to worry about getting along with your neighbors anymore, because you can go and hang out online with people who think just like you. There is no more room for differences of opinion or civil debate, because the doors are shut and the algorithms are holding the keys.
See why I’m stuck? I see why the filter bubbles are there, and it kind of makes sense. There is not really a good alternative at the moment. But I also see what it’s doing to us, and I am deeply chagrined.
I think the only real solution is to hand the problem over to the geeks, and get them to show us how to take back control of how we see the world. We need to pop our bubbles and set up filter sieves, where we retain the right to decide how big the holes are and where they're located. In the meantime, I did find one site that has a few practical tips on how to limit the extent of your filter bubble.
Any other tips are most welcome.