“Information, knowledge and culture are central to human freedom. How they are produced and exchanged in our society critically affects the way we see the state of the world.” (1) In our electronic world, a huge amount of our information comes from the Internet, and the production of that information is complex, requiring a critical approach from the consumer. Benkler is very optimistic about the way we use the public Internet space to inform our reading of this information. This shared public space allows many different voices to be heard, and Benkler claims that because so much information is available, there emerges “a more critical and self-reflective culture” (Benkler, p. 15). Whether this optimism is justified is another matter.
Recent controversies at Facebook over trending topics and over the algorithms that monitor site content both support and challenge the idea that users take a critical view of the information Facebook provides. How important is Facebook’s influence? The New York Times reports that Facebook has 1.71 billion members worldwide, and that half of American adults get their news from Facebook. (2) This represents a new narrowing of news sources: certainly no single newspaper or television news program is relied upon by half the adult population. When that many people rely on Facebook as their single source of information, everything Facebook does matters.
Kincheloe describes electronic media as providing us with a “secondhand culture, filtered and preformed in the marketplace and constantly communicated via cultural and mass media.” (3) This is what Facebook does in a very concentrated way, feeding us our culture and information, “filtered” as it chooses, giving us a filtered monoculture.
In May 2016, The Wall Street Journal reported on bias in the trending topics posted by Facebook, claiming that conservative news was downplayed while liberal news stories were chosen and emphasized. Facebook replied that these stories are selected by algorithms and are therefore neutral, and that guidelines are “in place to support consistency and neutrality.” (4)
While Facebook claimed neutrality, it nonetheless responded to the accusations of bias by changing its in-house training program. In June 2016, Facebook added political bias to its standard training sessions, alongside racial, gender, and other forms of bias. (5)
In September 2016, Facebook removed a 1972 Vietnam War photograph depicting a naked child from its site. Member response was so strong (thousands of people reposted the picture) that Facebook reinstated it. In both of these instances Facebook responded to user concerns.
What we see here is a constant back and forth between members and Facebook, a fight over control of the content. Members are not passive recipients of filtered information; they question what they are seeing and reading. This supports Benkler’s idea that “individuals are less susceptible to manipulation by a legally defined class of others – the owners of communications infrastructure and media” (Benkler, p. 9), and Kincheloe’s observation that control is never gained once and for all: “hegemonic consent is always in flux” (Kincheloe, p. 93).
But what do these two examples of member pushback really mean? We have two instances of Facebook changing its content because of outside input. Are they exceptions, moments when members were critical and aware that they were being manipulated? Is the norm the opposite? Is it true that most of the time Facebook’s readers are not critical? The bias in trending topics was reported by a former employee (a whistleblower), not by a user; only once it was exposed did members push back. Would even the most critical of readers have noticed a bias? Or would members be more likely to assume that because something was on Facebook, it was authoritative? To draw a parallel, “The information encapsulated in an article stands alone, authoritative by virtue only of its presence in the volume. Legitimacy is conferred by its place on the Library shelf.” (6) To paraphrase: news on Facebook is legitimized by being there. How many users can take the critical stance described by Kincheloe: “critical hermeneutics traces the ways the cultural dynamics [of popular media] position audiences politically in ways that not only shape their political beliefs, but formulate their identities” (Kincheloe, p. 103)? How able are we to “trace” the way Facebook “positions” us? Do we have the knowledge to do so? It does not seem that the open common space of the Internet, as described by Benkler, routinely gives us a “more critical and self-reflective culture.” Perhaps we can only hope that whistleblowers (like Snowden) will continue to come along, since we users may not be able to maintain a critical stance educated and robust enough to unearth these manipulations.
(1) http://www.benkler.org/Benkler_Wealth_Of_Networks_Chapter_1.pdf, p. 1
(2) www.nytimes.com/…/facebook-vietnam-war-photo-nudity.html
(3) https://www.researchgate.net/publication/261773451_Rethinking_Critical_Theory_and_Qualitative_Research
(4) http://www.wsj.com/articles/five-things-to-know-about-facebooks-trending-controversy-1462915385
(5) http://www.ibtimes.com/facebook-introduces-political-bias-training-after-trending-topics-controversy-2385911
(6) http://www.jstor.org/stable/4309685, p. 435