What We Want to See Versus What We Need to See

Will the Internet continue to be beneficial to democracy and freedom? Eli Pariser raises concerns that the information we receive via the Internet is filtered by providers such as Google Search and Facebook. Facebook edits which of your friends' posts you see based on the pattern of interests you have shown, that is, what you have clicked on. Pariser calls this an "invisible algorithmic editing of the Web". Google reviews 57 signals about you, including your location, the type of computer you use, and the Internet browser you use, to personally tailor the search results you receive. As a result, everyone sees their own individual results. Pariser gives an example of the vastly different Google results he and his friends received for the same query, "Egypt". It is not just Facebook and Google; this is pervasive across the Internet. Yahoo News, the Huffington Post, the Washington Post, the New York Times, and others all provide some degree of personalization and therefore filtering. Pariser notes, "This results in an Internet showing us what it thinks we want to see, but not necessarily what we need to see".
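To make this kind of signal-based personalization concrete, here is a minimal, purely illustrative sketch in Python. The signals, weights, and scoring rule are hypothetical assumptions for illustration only; the actual Google and Facebook systems are proprietary and far more complex than this.

```python
# Illustrative sketch only: how click-history signals could reorder identical
# search results for two different users. Signals and weights are invented;
# real providers use proprietary models over many more signals (Pariser cites 57).

from dataclasses import dataclass, field


@dataclass
class UserProfile:
    location: str
    browser: str
    click_history: dict = field(default_factory=dict)  # topic -> prior click count


def personalization_score(result_topic: str, user: UserProfile) -> float:
    """Score a result higher the more the user has clicked on its topic before."""
    clicks = user.click_history.get(result_topic, 0)
    return 1.0 + 0.5 * clicks  # each prior click boosts similar results (assumed rule)


def personalize(results: list, user: UserProfile) -> list:
    """Reorder (title, topic) results by personal fit; different users get different orders."""
    ranked = sorted(results, key=lambda r: personalization_score(r[1], user), reverse=True)
    return [title for title, _topic in ranked]


# Two users issue the same query, "Egypt", but receive different orderings.
results = [("Protests in Cairo", "politics"), ("Nile holiday deals", "travel")]
user_a = UserProfile("US", "Firefox", {"politics": 9})
user_b = UserProfile("US", "Chrome", {"travel": 7})
print(personalize(results, user_a))  # politics story ranked first
print(personalize(results, user_b))  # travel story ranked first
```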
Filter Bubbles and Your Information Universe

Eli Pariser continues by quoting Eric Schmidt, Executive Chairman of Google: "It will be very hard for people to watch or consume something that has not in some sense been tailored for them". Pariser says a filter bubble is the combination of all these algorithms, personalizations, and customizations that tailor your individual Internet experience. It is "your own personal, unique universe of information that you live in online". The problem is that you don't decide what is in your filter bubble; the providers and their algorithms do. "More importantly, you don't actually see what gets edited out."
Important Information Versus Irrelevant Information

The tension between your aspirational future self and your more impulsive present self determines what you see on the Web. These algorithmic filters weight what you click on first, so your impulsive, perhaps less relevant, clicks matter most in shaping your Information Universe. As a result, Pariser says, your "information diet" can become skewed toward "information junk food".
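The "junk food" drift is easy to picture with a toy example. The sketch below uses invented data and an assumed click-only scoring rule, not any provider's real formula; it simply shows that when engagement is the only signal, whatever gets clicked impulsively wins, regardless of importance.

```python
# Toy illustration: a click-only ranker drifts toward "information junk food".
# Data and scoring are invented for illustration; no real feed works exactly this way.

feed = {
    "celebrity gossip": {"importance": 0.1, "clicks": 40},
    "city budget report": {"importance": 0.9, "clicks": 2},
}


def click_only_score(item: dict) -> float:
    # Only engagement counts; the importance signal is ignored entirely.
    return item["clicks"]


ranking = sorted(feed, key=lambda name: click_only_score(feed[name]), reverse=True)
print(ranking)  # ['celebrity gossip', 'city budget report'] -- the impulsive click wins
```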
Myth of Internet Freedom of Information?

In the prior broadcast world there were gatekeepers, editors who controlled the flow of information and determined what we saw on TV and heard on the radio. The emergence of the Internet removed these gatekeepers, at least initially. Eli Pariser maintains "that is not actually what is happening right now": "What we are seeing is a passing of the torch, from human gatekeepers to algorithmic ones." Further, these algorithmic gatekeepers do not have "embedded ethics" as they "curate the world for us" and decide what we do and do not get to see on the Internet.
Filter Bubble Parameters and the Web of One

These algorithmic gatekeepers determine what we see, our Information Universe, by sorting for relevance. Eli Pariser asserts that additional parameters and filters should be included, such as important information, uncomfortable information, challenging information, and other points of view. He also notes that a well-functioning democracy needs a good flow of information. Pariser gives the example of newspapers circa 1915. Are we back in 1915 on the Web? Do we need to break the information logjam again? Pariser thinks so. The new algorithmic gatekeepers need to add more parameters and open the flow of information to include a sense of "public life" and "civic responsibility". Further, these algorithmic gatekeepers, and what is being filtered, should be transparent. Ultimately, we need control over the algorithmic gatekeepers to decide "what gets through and what doesn't", that is, what is in our Information Universe on the Internet. The Internet needs to introduce us to new people and ideas, connect us, and add perspective. Instead, the algorithmic gatekeepers are creating a Web of One.
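One way to picture Pariser's proposed fix is a ranking score that blends relevance with the other qualities he lists. The sketch below is a hypothetical illustration of that idea, not a description of any deployed system; the field names and weights are assumptions, and in Pariser's framing the weights themselves would need to be transparent and under the user's control.

```python
# Hypothetical "bubble-aware" scoring: relevance is blended with the extra
# parameters Pariser calls for (importance, challenge, other points of view).
# Weights and fields are illustrative only and would ideally be user-adjustable.

def bubble_aware_score(item: dict, weights: dict) -> float:
    return (weights["relevance"] * item["relevance"]
            + weights["importance"] * item["importance"]
            + weights["challenge"] * item["challenges_user_view"]
            + weights["diversity"] * item["other_point_of_view"])


user_weights = {"relevance": 0.4, "importance": 0.3, "challenge": 0.2, "diversity": 0.1}

items = [
    {"title": "Story you already agree with", "relevance": 0.9, "importance": 0.2,
     "challenges_user_view": 0.0, "other_point_of_view": 0.0},
    {"title": "Uncomfortable but important story", "relevance": 0.5, "importance": 0.9,
     "challenges_user_view": 0.8, "other_point_of_view": 0.7},
]

items.sort(key=lambda it: bubble_aware_score(it, user_weights), reverse=True)
print([it["title"] for it in items])  # the challenging story can now surface first
```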
Beware Online "Filter Bubbles" As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our world-view. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.
In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.
His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter by which many of us see the wider world -- are getting better and better at screening the wider world from us, by returning only the search results they "think" we want to see.