An Open Source Cure for Cancer?: Participatory Medicine in the Digital Realm

Diagnosed with brain cancer in September 2012, Italian designer, computer engineer, and hacker Salvatore Iaconesi decided to publish his personal medical data online, inviting the public to respond. The result was an online community of cancer survivors, family members, doctors, and other allies who came together to support Iaconesi (and each other) through treatment. The project, La Cura, thus came to embody a collaborative social space, uniting treatment and healing outside of the hospital setting.

The project was motivated by Iaconesi’s experience as a patient—an experience he has described as dehumanizing and altogether isolating. According to Iaconesi, after diagnosis, he felt his individual identity disappear; in becoming a patient, his complexity as a human being was ignored to the point where he became a set of data for doctors to analyze and manipulate. Personal identity had been taken out of the equation, and data (rather than the human condition) came to determine his treatment.

La Cura essentially turned this dynamic on its head, appropriating Salvatore’s medical data and breaking the patient paradigm. The data was published online where anyone could access, interact with, or respond to it, thereby producing their own creative meaning. No longer a medical commodity, “the cure” became a social relationship born out of the interactions of the public. Salvatore’s project developed into a global performance in which personal expression gave shape to a shared social consciousness around cancer and its treatment.


Brain scans from Iaconesi’s medical record.

In the hospital setting, data associated with the body is expressed in language particular to doctors and other medical practitioners. Iaconesi was frustrated to find that his digital medical records were in a closed, proprietary format that he could not even open on his own computer. Feeling that his humanity had been replaced by restricted clinical records, he began by translating his electronic medical records into “personal open data.” As he explains on his website, he “cracked them[…,] opened them and converted the contents into open formats, so that [he] could share them with everyone” (Iaconesi). He started by sending the data to doctors and publishing their responses in open formats so that others with his condition could benefit from the information. He invited visitors to his website to “grab the information about my disease[…]and give me a CURE: create a video, an artwork, a map, a text, a poem, a game, or try to find a solution for my health problem” (Iaconesi). These are obviously not formats one would traditionally associate with medical “cures,” but they serve to reframe the notion of “the cure” as a more social, community-based creation. By the end of this experiment, Iaconesi had received 35 videos, 600 poems, 15,000 testimonies, 500 reviews of doctors, and over 50,000 different strategies to cure his cancer.
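Iaconesi has not published his exact conversion pipeline, but the gesture of turning a locked clinical export into “personal open data” can be sketched in a few lines of Python: re-emit the record’s contents as JSON and CSV, formats any computer can open. All field names and values below are hypothetical illustrations, not his actual records.

```python
import csv
import json

# Hypothetical fields extracted from a proprietary medical-record export.
record = {
    "patient": "Salvatore Iaconesi",
    "study_date": "2012-09-12",
    "modality": "MR",
    "finding": "low-grade glioma, frontal lobe",
}

# Open format 1: JSON, readable by any modern language or web browser.
with open("record.json", "w") as f:
    json.dump(record, f, indent=2)

# Open format 2: CSV, readable by any spreadsheet program.
with open("record.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=record.keys())
    writer.writeheader()
    writer.writerow(record)
```

Once the data sits in open formats like these, anyone can read, remix, or respond to it, which is precisely what La Cura asked the public to do.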


Portion of the relational graph of submitted “cures.” (The full version can be found on the La Cura website.)

Resistance to these types of programs often comes from healthcare practitioners and medical institutions that see the digital sphere as potentially misleading and unreliable. While these concerns are not unfounded, the potential for positive change in healthcare through such initiatives cannot be overlooked: digital spaces of interaction such as the internet allow for the public negotiation of power and an active kind of social creation that can be found nowhere else.

In open source, “a common problem is placed in a common space, and people from around the world turn themselves to working, in parallel, on this problem” (Lessig, 107). This, in effect, turns the structure of governance on its head, as the control of information is central to power. Through La Cura, Iaconesi explores how networked communications can empower both patients and patient communities. The internet pluralizes flows of information while simultaneously widening the scope of commentary.

Though not quite what Larry Diamond envisioned, parallels can be drawn between Iaconesi’s project and “liberation technology.” According to Diamond, liberation technology “is any form of information and communication technology (ICT) that can expand political, social, and economic freedom” (Diamond, 70). Aiming to humanize health care through the use of participatory media and digital communication, La Cura circumvents the traditional power structures of information, in this case the ones that structure medical knowledge. Health care is recast in a decidedly social way, shifting not only the dynamics of the patient/practitioner relationship but also the overall approach to disease treatment. Actively performing their own healing process through digital media, users are empowered by such a reimagined system, constructing a reflexive space of creative healing.

More democratic healing practices target all spheres of a patient’s life and wellbeing. The environments of such living labs are co-developed with users, not just for them, transforming the healthcare center into a social hub where a diverse array of individuals’ distinctive needs initiates creation rather than normalization. As in La Cura, active collaboration and performance in a social setting initiate a thinking process that targets the healing of the individual human being, rather than the isolated treatment of a medically defined disease. This expansion of participatory medicine into the digital realm allows for a more inclusive and interactive form of whole-person care. As health comes to be regarded as more than just the absence of pain and suffering, the dynamics of illness and disease must be viewed within this expanded framework of social, mental, and community health.

References

Diamond, L. (2010). “Liberation Technology,” Journal of Democracy 21(3): 69–83.

Iaconesi, S. (2012). La Cura, an Open Source Cure. http://opensourcecureforcancer.com/

Lessig, L. (1999). “Open Code and Open Societies: Values of Internet Governance,” Chicago-Kent Law Review 74, 101–116. http://cyber.law.harvard.edu/works/lessig/final.PDF

A Political Campaign in 140 Characters

With the development of various technologies and the progression of the digital age, the electoral process has changed dramatically since George Washington was elected in 1789. The presidential candidates for the 2016 election are fighting a battle that has never been fought before.

The Past – 1789 to 2000s.

In early America, presidents such as George Washington and James Monroe traveled by horseback or carriage to address crowds in person and published statements in “broadsheets” and early newspapers. Lincoln had the relative advantage of traveling by locomotive or using the telegraph. Telephones appeared in the White House in 1877 while Rutherford B. Hayes was president. Like Harding, President William Taft used the phonograph to distribute recordings of his speeches. However, the most rapid advancement in communication for presidents occurred in the 20th century. [1. http://www.history.com/this-day-in-history/harding-becomes-first-president-to-be-heard-on-the-radio]

Those advancements came with the introduction of radio, television, and later, the internet. Each technology had the power to change, for better or for worse, a candidate’s campaign and influence on voters.

One of the most notable and influential presidential campaigns to take advantage of the media occurred in 1960, when Kennedy was running against Nixon. They participated in the first-ever televised presidential debates, known as “The Great Debates.” The debates were simultaneously broadcast over the radio. Those listening on the radio declared Nixon the winner, while those watching the televised broadcast decidedly chose Kennedy. Why such a stark difference of opinion? On the radio, listeners judged the debates by speech and tone. With the introduction of television came all kinds of new ways to judge candidates: not just what was said, but body language, eye contact, charisma, and of course, appearance. When it came to the newly developed judging criteria, Kennedy floored his opponent. Kennedy looked directly into the camera, whereas Nixon shifted his gaze to the side. Kennedy was tanned and wore make-up; Nixon looked pale and sickly, having just recovered from the flu. [2. http://www.history.com/topics/us-presidents/kennedy-nixon-debates]

Polls revealed that more than half of all voters had been influenced by “The Great Debates,” while 6% claimed that the debates alone had decided their choice. Whether or not the debates cost Nixon the presidency, they were a major turning point in the 1960 race—and in the history of media in campaigns. [3. http://www.history.com/topics/us-presidents/kennedy-nixon-debates ] 

The Nixon–Kennedy debates, 1960. [1. http://www.kingsacademy.com/mhodges/03_The-World-since-1900/11_The-Bewildering-60s/pictures/Nixon-Kennedy-debates_1960.jpg]

The slightly more recent past.

Social Media. Need I say more? Okay, I guess I do.

Barack Obama, dubbed the “President of Social Media,” garnered five million supporters across fifteen social networking sites for the 2008 election, with most of the “follower” count on Facebook and Twitter. Prior to that election, neither platform had been used in campaigns. During his 2012 campaign, Obama hosted an “Ask Me Anything” thread on the popular site Reddit, which became one of the most popular threads of all time. Obama and his team strategized to use these social media platforms to reach out to young and minority voters. Upon his victory in the 2008 election, Obama tweeted, “We just made history. All of this happened because you gave your time, talent and passion. All of this happened because of you. Thanks,” a message that was retweeted only 157 times. His 2012 victory tweet (“Four more years.”) became the most shared post in the site’s history, with over 400,000 retweets within a few hours of posting. [4. http://www.theguardian.com/world/2012/nov/07/how-barack-obama-celebrated-twitter]

barack

This dramatic increase shows the incredible growth not only of users on Twitter, but of their online interactions with politics. Obama was the first candidate to embrace and effectively utilize social media in his campaign and throughout his presidency. His strategy was so effective because “the medium wasn’t the message, so to speak; it was the vehicle. It connected with people, with real enthusiasm, in real time, and gave them an easy and accessible way to show their support for change.” [5. http://www.dragonflyeffect.com/blog/dragonfly-in-action/case-studies/the-obama-campaign/] Obama currently has twenty aides who update his social media accounts.

Today.

The current campaigns for the 2016 presidential election feature the first “official” integration of social media of its kind: a partnership with Twitter. Sure, hashtags have been used widely for years, and Obama certainly capitalized on Twitter during his campaign and throughout his presidency. However, this is the first election in which Twitter has officially partnered with the GOP and Democratic debates.

cbs

This partnership featured live coverage of the events on Twitter. Users simply had to click on the #GOPdebate or #DemDebate links and they would be brought to a live Twitter feed of coverage. The feed showed popular tweets using the hashtags, a “top stories” section with photos and videos, and a sidebar of articles related to topics being discussed in the debates, provided by organizations such as USA Today, the New York Times, and Fox News. (It is worth noting that the organizations live-tweeting were largely either reiterating the candidates’ claims or, in some cases, promoting their own agendas. @PlannedParenthood was especially active during the debates, either condemning the views of Republicans or praising those of Democrats. Clearly, the material being promoted on Twitter was not bias-free.)

The Democratic debate on November 14th, 2015 aired on the CBS network. Twitter and CBS linked together, and users had the unique opportunity to tweet to CBS using the #DemDebate hashtag. Tweets were pulled from the thousands sent, and some were read aloud for the candidates to respond to, not unlike the way reality shows such as America’s Got Talent and The Voice display live tweets. Democratic candidate Hillary Clinton made a statement early in the debate, and a Twitter user tweeted asking her to clarify her response. Clinton addressed the question and cleared up any miscommunication (well, so she hopes).

All of the candidates were tweeting during the debates. Or rather, someone on their teams was tweeting from their accounts. This gave candidates an extra platform to clarify or expand upon their responses in the debate. Hillary Clinton’s account even tweeted at the start of the debate, “If you’re not watching the #demdebate, we can email you the highlights!” along with a link to sign up for her mailing list. Clever, Hillary. Clever.

Okay, but does social media really make a difference?

Facebook claims to have increased voter turnout by 340,000 votes. A third of those aged 18-24 indicated that reading something on social media would influence their vote more than televised debates would, and 41% of online users in that age group have participated in political activity online. [2. http://www.huffingtonpost.com/r-kay-green/the-game-changer-social-m_b_8568432.html]

In today’s world, not having a digital presence would be more detrimental than having one. Candidates who don’t use social media might come across as having something to hide. Erin Lindsay, a principal for digital at Precision Strategies, says social media “forces candidates to show more personality. Authenticity is a big thing in social media. I think the candidates that are the most successful are the ones that are clearly the most comfortable.” [7. http://thehill.com/policy/technology/251185-welcome-to-the-social-media-election] Voters want someone genuine, and social media gives candidates a way to prove their authenticity.

Political advertisement spending is expected to reach 11.4 billion dollars for the upcoming election. Spending on social media is estimated to account for over half of the digital ad budget, a 5,000% increase from the 2008 election. [10. http://www.wired.com/2015/08/digital-politcal-ads-2016/] With a budget this large, you can definitely expect a flurry of activity from the candidates on the networking platforms.

Facebook and Twitter have been the major venues for social media campaigning. However, this is the first election in which Instagram is emerging as a player. In November 2015, Instagram boasted 400 million monthly users, as opposed to Twitter’s 316 million. [9. http://www.cnbc.com/2015/09/23/instagram-hits-400-million-users-beating-twitter.html] Candidates are beginning to stake out campaigning territory on Instagram, but it is still not as popular as Facebook and Twitter when it comes to political activity. Imagine the mental anguish that must go into choosing the best photo filter…

From riding on horseback to constructing a (hopefully) carefully thought-out tweet, candidates have embraced technology as part of their campaigns.

A Tale of Tweens, Teens and Technology Addiction


Did you know?

That 92% of teens report going online daily, including 24% who say they go online “almost constantly”? More than half (56%) of teens (defined in the report as those ages 13-17) go online several times a day, and 12% report once-a-day use. Just 6% report going online weekly, and 2% go online less often. (www.pewinternet.org, 2015)

Today’s tweens (pre-teens) and teens are born into a technological age in which the computer, coupled with the Internet and mobile phones, has become a super information highway that is fast becoming a substitute for parenting, personal interaction, socializing, and learning. My daily encounters always include seeing teens on the train or bus perusing their phones constantly, whether alone or in groups. Most of the time these teens are on social media sites such as Facebook, Twitter, or Instagram, where one can write headlines and post statuses with pictures. Obviously this happens at home as well, and most parents have come to accept it as the norm. But how normal is this norm? How normal is technology addiction? Technology is not bad overall; it is the overuse of technology that makes it bad, not only for our tweens and teens but for everyone.

According to Larry Diamond, “Digital ICT has some exciting advantages over earlier technologies. The Internet’s decentralized character and ability (along with mobile-phone networks) to reach large numbers of people very quickly…” (Liberation Technology).

As the statement above suggests, technology has evolved immeasurably since the 2000s. Earlier technology included computer labs, phone booths, and landline telephones, among others. In earlier days one would visit the library or school to use the computer lab; today, that trip is optional. Likewise, phone booths and landlines have been phased out by the invention of the mobile phone, which has become easily accessible. As a result, many children (tweens/teens) seldom ask for permission to go play outside or to visit a friend’s house. Instead they are given smartphones with a wide variety of apps, games, and social media, and they ask for permission to have a Facebook or Instagram account.

Consequences of Technology Abuse by Tweens and Teens

 


  • Technology Abuse disrupts teenage learning

Technology has impeded our tweens’ and teens’ ability to think critically and to be original in their ideas. For example, when faced with homework and assignments, teens immediately go to the Internet to look for ready-made answers instead of researching information that would stimulate their own thoughts. This dependency on readily available information has created a new breed of what appear to be “lazy-thinking individuals,” who tend to seem intelligent outwardly but lack depth inwardly. There is little attempt to think creatively and thereby exercise their cognitive abilities. It is therefore not surprising that this teenage generation relies solely on Google to look up answers to questions as simple as “Who is the 30th President of the United States?” We should be looking among today’s tweens and teens for future leaders: strong-minded individuals capable of making wise decisions.

  • Technology Abuse is characterized as addiction

Jennifer Soong, in her WebMD feature on “the Paradox of Modern Life,” cites Healey (2009), who states that Internet addiction can be explained as a psychological dependency resulting from habitual or compulsive Internet use. Soong points out that even though Internet addiction is not yet recognized as a formal diagnosis, reports suggest that it is responsible, among other things, for rearing a generation of impulsive individuals who are unable to concentrate; that is, it is negatively impacting their education, work, and personal relationships. Meanwhile, Healey’s (2009) article in the Los Angeles Times, “Internet Addiction: a 21st century epidemic with some more at risk than others,” highlights how the use of technology has become nearly inescapable. Healey therefore suggests protecting learners who are insatiably curious and eager to experiment, since this addiction is a serious concern in the information age.

  • Technology Abuse fosters poor social skills

Writing on Teen Ink about whether technology is hurting or helping social skills, kburt94 of Houston, Texas states that some technology can be useful for some interactions, but that some kids allow the Internet to take control of their social lives, and slowly their willingness and ability to socialize face-to-face decreases. There is now a debate on this topic, and researchers worry that kids might be missing out on important social cues and on the ability to socialize without coming off as socially awkward. This notion is supported by Brown (2013), whose analysis shows that a new category of psychiatric disorder called “internet addiction disorder” has been proposed for the revised version of the DSM, highlighting the negative effects of internet use. Brown (2013) cites a Professor of Communications at Alma College who reports that in the last five years there has been “erosion in students’ ability to focus and even their ability to engage in face-to-face interaction” (pp. 2-3). Finally, Wiesen (2014), in her Science Learning article, states that some child development experts report that children who spend excessive time in front of screens are not developing the social skills they need to handle interpersonal relationships effectively.

In order to prevent technology abuse, I believe that everyone should be involved in this developing phenomenon: parents, teachers, children, and schools. Parents need to become more tech-savvy and stop making excuses for their lack of computer skills; this will enable them to be more vigilant about their children’s technology use and overuse. Whether they are single parents or not, this issue must be taken seriously, and common goals should be expressed and implemented. Parents: set limits and boundaries on cell phone usage, set age limits for when your children can get cellphones, utilize time limits, and even suggest a part-time job for children who are old enough. In addition, schools and teachers can work hand-in-hand: stricter cellphone usage guidelines in schools should be monitored, and there should be guidance on when and how cellphones are used during school hours. Time should be allocated so that students are given ample opportunity for social interaction and discussion. This would aid in fostering the enriching learning environment needed by our tweens and teens.

 

 

References

Brown, C. (2013). Digital Commons. Retrieved December 7, 2015, from http://digitalcommons.conncoll.edu/psychhp/40

Healey, M. (2009, October 5). “Internet Addiction: a 21st century epidemic with some more at risk than others.” Los Angeles Times. Retrieved December 7, 2015, from http://latimesblog.latimes.com/booster_shots/2009/10/internet-addiction-a-21st-century-epidemic.html

kburt94. (n.d.). Teen Ink. Retrieved December 7, 2015, from http://www.teenink.com/opinion/current_events_politics/article/162125

Lenhart, A. (2015, April 9). Pew Research Center. Retrieved December 7, 2015, from http://www.pewinternet.org

Soong, J. (2005). WebMD. Retrieved December 7, 2015, from http://www.webmd.com

Wiesen, N. (2014, April 15). Science Learning. Retrieved December 7, 2015, from http://www.scilearn.com/blog/social-skills-digital-age-screen-time

Sciencescape’s Approach to Big Data in the Realm of Scientific Research

The current image of consumer technology (personal computers and front-end applications) is one of simplicity: small boxes that house smaller components and microchips, departing further and further from the hulking, esoteric, push-pin and vacuum-tube monoliths of the past. This is, of course, ideal, as an increasingly tech-minded society compels its citizens to participate ever more wholly within its self-designated bounds.

 

Even Google’s server rooms express the hushed, confident uniformity of its homepage.

 

But this picture belies the massive data swarm on the other side of the front end. We might only catch glimpses of it through our interfaces, but Big Data is consuming the world, for good and ill.

In their article “Critical Questions for Big Data,” danah boyd and Kate Crawford characterize the concept:

We define Big Data as a cultural, technological, and scholarly phenomenon that rests on the interplay of:

(1) Technology: maximizing computation power and algorithmic accuracy to gather, analyze, link, and compare large data sets.

(2) Analysis: drawing on large data sets to identify patterns in order to make economic, social, technical, and legal claims.

(3) Mythology: the widespread belief that large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity, and accuracy (663).

This interplay defines the Big Data approach to research and thought, in which technology’s potential to gather, manipulate, process, and analyze torrents of data roots the belief that Big Data holds meaningful value for understanding the world thus constructed.

Sciencescape falls very much under the banner of this paradigm: its data set is the universe of scientific academic research and journal articles. I am intensely interested in the site because of its attempt to organize, for our ease of access, a nebula that doubles in output every nine years (Van Noorden). To this end, the site claims to have indexed 24 million articles.

This number, in itself, is not impressive, as big academic indexes are common. Instead, the novelty arises from Sciencescape’s method of presentation and management. Here is a video from their About section:

From this clip, it appears that Sciencescape aims to establish a web of scholarly data, using computational algorithmic methods to pick out key concepts such as subjects, authors, and institutions, and to order papers in a way that exposes their topic, field of study, and impact. Applying Big Data methodology to Big Data sets, Sciencescape works to provide meaningful orientation in the realm of science at a scale that would seem too grand for more traditional approaches. The site offers free accounts for access to its database, so let us now explore how the site’s aspirations play out in their implementation.

We will discover the functionality of the site through a contemplation of its elements as they present themselves in the guided tour of Sciencescape. I do this not only to easily distinguish features of note, but as a means to peer into the minds of the site’s creators; I want to get an idea of what they regard as valuable, novel, and unfamiliar in their approach to Big Data analysis.

First, the overall presentation of the site before log-in feels very much like the now-familiar startup interface (though the company’s headquarters is in Toronto).


 

 

Very professional and enticing to potential users and investors alike, this web page contains all the standard links to information about the project and those behind it. Clicking on the About tab allows a user to learn about the functionality of the site. Of particular note is the publishing partners sub-tab:

 


 

From the about tab, we are also able to access the guided tour, which allows us to bypass log-in.

PAPERS

The first stop is a page for an individual paper:

 


 

This particular web portal holds many interactive elements. For instance, we can follow the writers and the journal, into the past and into the future as new publications are added. If the paper is available to us, we can download a PDF. Other Web 2.0 mainstays like “Add to Library,” “Broadcast,” and “Bookmark” also appear, as well as buttons for Facebook, Twitter, and email.

At the bottom of the page there are lists of citations to and from this specific paper, but also “recommended readings” and “fields.” I am not certain what factors go into the recommendations, perhaps overlap in authors and topics, but the “Fields” tab interests me the most. The list is populated with associated concepts, all of which expand through time to encompass other articles.

FIELDS

The next main section of the tour presents an example of these “Fields.” Displayed is the timeline associated with AIDS research:


These field pages, I think, really display the greatest value in Sciencescape. This research timeline shows the sudden jump in scientific awareness of AIDS during the period of April-July 1983, when 825 citations mentioned it. The three months before show only 30 citations. This tool is immediately helpful in accounting for the history of the disease as science has seen it.
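Sciencescape’s internals are not public, but the basic mechanic behind such a timeline is easy to sketch: bucket every paper tagged with a field by calendar quarter and count. A toy version in Python, using invented publication dates rather than the site’s real index:

```python
from collections import Counter
from datetime import date

# Invented publication dates for papers tagged with a hypothetical field.
papers = [date(1983, 1, 15), date(1983, 4, 2), date(1983, 5, 20),
          date(1983, 5, 30), date(1983, 6, 11), date(1983, 7, 4)]

def quarter(d: date) -> str:
    """Label a date with its calendar quarter, e.g. '1983-Q2'."""
    return f"{d.year}-Q{(d.month - 1) // 3 + 1}"

# Count papers per quarter; a spike like the April-July 1983 jump in
# AIDS citations shows up as a sudden rise between adjacent buckets.
timeline = Counter(quarter(d) for d in papers)
print(sorted(timeline.items()))
# → [('1983-Q1', 1), ('1983-Q2', 4), ('1983-Q3', 1)]
```

The real system presumably does this over millions of indexed records, but the shape of the output, counts per period per field, is the same.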

The other significant value on this page is the “Eigenfactor,” which Sciencescape uses to rate papers’ impact. In the field of AIDS research, we see that a paper published in May 1983 influenced others to the greatest amount.

The Eigenfactor is one of the defining characteristics of the whole engine:

The Eigenfactor algorithms take into account not only the citations, but where the citations come from.  This is similar to how modern search engine algorithms rank websites.  A citation from a highly referenced journal, article or author carries more weight than one from a journal nobody has read (https://sciencescape.org/eigenfactor).

With this factor, scientists have a potential metric for significance that accounts for the entire web of data, rather than just explicitly referenced citations.
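Sciencescape does not publish its implementation, but the recursion the quote describes, where a citation counts for more when its source is itself heavily cited, is the same idea behind PageRank-style eigenvector centrality. A toy power-iteration sketch over an invented four-paper citation graph (a simplification, not the actual Eigenfactor algorithm, which also aggregates at the journal level):

```python
# Invented citation graph: each key cites the papers in its value list.
cites = {"A": ["B"], "B": ["C"], "C": [], "D": ["C"]}
papers = list(cites)
d = 0.85  # damping factor, as in classic PageRank

# Start from a uniform score and iterate: a paper's score is passed on,
# in equal shares, to the papers it cites.
score = {p: 1.0 / len(papers) for p in papers}
for _ in range(50):
    new = {p: (1 - d) / len(papers) for p in papers}
    for p, targets in cites.items():
        for t in targets:
            new[t] += d * score[p] / len(targets)
    score = new

# C is cited by both B and D, and B is itself cited by A,
# so C ends up with the highest score.
print(max(score, key=score.get))
# → C
```

A raw citation count would rank C and B closer together; the iteration is what lets a citation from a well-cited source carry extra weight.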

THE FINAL STOPS

The last two sections of the tour deal with the facilities in place to let someone keep up to date on her research. She can follow elements such as journals, universities, authors, and fields; new papers associated with these concepts then appear in her Feed as they are published. An extra level of organization, called “Streams,” also exists: these groups can be constructed by the follower as a means to further link the items she follows. She can sort by these streams to focus her attention on one area of association.


Overall, the tour was promising to me as an information scientist. Ideally, this specific conglomeration of social and semantic tools should increase relevancy and visibility across the extreme output of scientific research. Nevertheless, I worry about the implementation of such algorithms over such big data sets; I am not a bio-scientist, and so I cannot judge how well Sciencescape’s metrics work in coordinating fields and impacts. Certainly, more inquiry is needed, but the concepts that underlie the site are compelling.

 


Works Cited

boyd, d., &amp; Crawford, K. (2012). “Critical Questions for Big Data.” Information, Communication &amp; Society, 15(5), 662–679.

Van Noorden, R. “Global Scientific Output Doubles Every Nine Years.” Nature News Blog. Web. 30 Nov. 2015. &lt;http://blogs.nature.com/news/2014/05/global-scientific-output-doubles-every-nine-years.html&gt;

Digitization and the NYU University Archives

On the 10th floor of Elmer Holmes Bobst Library lies a well-kept secret: the NYU University Archives. With only two full-time staff members, four part-time graduate assistants, and one part-time undergraduate student, the department is incredibly lean. This small staff supports roughly 800 unique patrons per year, with 55% of patrons affiliated with NYU and 45% coming from outside the University. Though only formally created in the late 1970s, the department has been collecting materials since the University’s founding in 1831. The Archive is home to a range of collections, including items such as architectural renderings, administrative records, realia, busts, posters, audiovisual materials, mascots, and more. The Archive houses its collections onsite at Bobst and in a storage facility at Cooper Square, with 40% of the collection stored offsite in a warehouse upstate. Unlike a rare book and special collections reading room, the University Archives has a fairly flexible admission policy. Its reading room is open Monday through Friday, 9:30am to 5:00pm, by appointment only. However, the collection is not restricted to patrons with specific credentials (i.e., academics), but instead is open to anyone with an interest in the Archive’s holdings.

Visiting NYU’s University Archives definitely brought up issues of access. The archives are open five days a week during normal business hours, so a visit to the collection would be impossible for anyone with a 9-to-5 weekday job. Janet Bunde, the interim University Archivist at NYU, explained that the Archive hopes to extend its hours in order to better accommodate patrons who are only available on weekends. Digitization, of course, serves as a successful remedy to the challenges of physical access. However, a crucial issue facing many university archives and special collections, including the NYU University Archives, is the lack of funding available for digitizing records. NYU has digitized only an extremely small portion of its university archives, including 1,400 photographs, a small number of 35mm films, and a few documents. The collections that NYU has digitized have been made easily accessible through digital finding aids, and are clearly organized and presented.

NYU University Archives  (from NYU Archives website)

I applaud Bunde and her staff for their hard work to make the archives accessible to all, and it seems that they are making great progress toward opening the collection to an even broader audience. There are still, however, major issues across institutions as to funding priorities. Where staff constraints limit access, as with reading room hours, digitization serves as the antidote. Digitizing and uploading materials to a server is a one-time labor commitment that results in inestimable hours of use by patrons. This includes, of course, the ongoing labor of website and database maintenance, but one could argue that this time is still minimal compared to the perpetual staff needs of operating a reading room, retrieving physical materials for patrons, and providing face-to-face reference support (which of course has its positives, as compared to digital reference support).

Another challenge that Bunde discussed was determining which materials are necessary to preserve. In a university as large as NYU, there is a significant amount of material that must be sorted through and culled on an annual basis. As the university bolsters its digitization efforts, issues of quantity will become even more pressing. As Roy Rosenzweig states, “the simultaneous fragility and promiscuity of digital data requires yet more rethinking – about whether we should be trying to save everything, who is ‘responsible’ for preserving the past, and how we find and define historical evidence” (Rosenzweig, 2003). This is a challenge that Bunde and her staff will have to grapple with as they increase their digitization efforts.

Ultimately, I was very impressed with the NYU University Archives’ commitment to open access for all. Bunde and her staff have created a welcoming environment for researchers who may not be regular patrons of university special collections. As Joan Schwartz and Terry Cook state, “archives – as records – wield power over the shape and direction of historical scholarship, collective memory, and national identity, over how we know ourselves as individuals, groups, and societies” (Schwartz & Cook, 2002). I believe that NYU has worked to create an archive that is inclusive of the many dialogues of NYU and the surrounding community, both historically and in the present day. I was impressed with the department’s commitment to functioning as an archive for the community, and I look forward to seeing what types of materials future digitization projects bring to light.

References

Schwartz, Joan M., and Terry Cook. “Archives, Records, and Power: The Making of Modern Memory.” Archival Science 2.1-2 (2002): 1-19. Print.

Rosenzweig, Roy. “Scarcity or Abundance? Preserving the Past in a Digital Era.” The American Historical Review 108.3 (2003): 735-762. Print.

 

The Electric Book Parade

Two ideas: Marshall McLuhan’s notion of media as extensions of ourselves, in which technical designs “amplify or accelerate existing processes” so that their pace and scale are magnified – railroads and cars, for example, enlarging the human function of movement. Essentially, our central nervous system has manifested itself in electric technology, and yet it lays itself bare to overt mechanization. This notion couples well with Walter Benjamin’s idea that the product of mechanization lacks the presence of space and time, and therefore authenticity. The mechanical reproduction of a work of art is depreciated, just as a landscape cannot transfer its unique aura to a picture. These two ideas are subtly entangled within the contention between bound books and e-books.

Convenience is the glory of our current era, and e-books are a product of this value; after all, a whole library is accessible from one gadget with web features, PDF downloads, and a lightweight design. Many college students rejoice as they no longer have to lug around hefty textbooks with lofty prices, sparing their backs and bank accounts. Moreover, e-comfort scratches our technophilic itch, efficiently slicing away analog drudgery. So why would a respected professor of journalism have e-books banned from her classes? Quite simply, because printed books are a “better interface” for seminar discussions; everyone works off the same page, whereas otherwise students fiddle away class time with battery issues, interface hiccups, and searching for relevant quotes. Using the same textbook also contributes to overall class cohesion, as discussions can concentrate naturally without e-distractions, bolstering the dialectical process of face-to-face interaction – a prime value of traditional education. It is also reassuring to know the class’s undivided attention is shared, whereas electronics allow one to wander. The ability to e-wander breaches the unwritten rule of classroom etiquette: being present without escape. This exemplifies the difference between the media: books are based on communication between minds, while e-readers are based on efficiency and amusement.

Another example of e-distraction comes from a recent study of middle school students. It revealed that those who read e-books retain less than students reading the same content in print; the e-reading students were unable to retell what they had read. Researchers believe the “flashy gimmicks, fun interactive designs and ability to wander from the text distract readers from the task of actually, well… reading.” The message is filtered through the medium, and a bound book is an object whose sole purpose is to be read. An opinion piece from Publishers Weekly echoes this e-deficiency as the snarling author quickly connects technophilia with the degeneration of our minds: we can no longer read (let alone write) literature of the War and Peace type because of the chronic scatterbrain induced by e-reading (blogs, tweets, emails, text messages, etc.). He believes we are straying from the supple and subtle mind that gave us the Renaissance, the Enlightenment, and Modernism, “because our brains can no longer think beyond a tweet,” nor read beyond one either.

Moreover, there are issues of freedom involved as well. Richard Stallman, a software engineer and founder of the GNU Project, believes e-books are a “step backwards” from printed books. In his view, every advance in technology furthers corporate infringement upon our freedoms, specifically privacy and ownership. For example:

– Anonymity is impossible, as user identification is required.

– Instead of ownership, identified users sign a contractual agreement for restrictive licensed access.

– The format on e-book devices is proprietary, and therefore restricted to specific software.

– You can’t borrow books, as traditional lending is not allowed without a gauntlet of red tape.

– Your e-book can be deleted without notice.

Stallman simply believes companies should respect the freedoms of individual citizens – which of course is possible – and he posits solutions: offer authors direct payment from e-book users, and/or distribute funds based on an author’s popularity. We needn’t forgo rights that were vehemently fought for in the name of commercial convenience and e-amusements. After all, quantitative forces are behind e-readers – profit and general readership statistics are their root values – but this is a strictly mercantile approach that hardly considers the qualitative factors that are the source of humanity’s greatness: a certain ineffability intrinsic to the quality of being alive. These values are being brushed aside, along with privacy and ownership, for e-novelties.


Perhaps this face-off between e-readers and books is merely a consequence of the notion of progress, which is inseparable from science and technology but is inappropriately applied to the arts. Progress in science and technology inherently has the new superseding and replacing the old, whereas the excellence achieved in the arts does not die with time, but continues breathing and enriching future generations. A great work of art may inspire others to supersede it, but it is impossible to replace; for every influential work of art there is a barrel of books that critique, copy, satirize, adapt, and reinterpret it, most of which fall into the shadows of obscurity.

For literature specifically, the bound book is the best home for the life of ideas – ideas that are revealed and preserved in words. Think of the Gutenberg Bible on display at the NYPL main branch: you will never see an e-book displayed in such a way, even if it is an exact replica of an incunabulum. The uniqueness of this specific book is inseparable from its material, and thus its aura is irreproducible. The physical object is part of the artistry, authenticity, and experience – something e-readers are neither capable of nor made for. Yet they are made for harnessing the prolific data stream, and e-everything is ushering us into a profane future without the aura of our traditional past, which is too slow, esoteric, and mythopoetic to serve the current mercantile/quantitative/vaudevillian value systems. However, every raging river has its side streams and quiet creeks, and this is where the literati will continue to thrive. And yet, there is a bigger question lurking about: to what degree are we willing to mediate being alive through a screen?

Makerspaces in the U.S. and China

Recently, I visited three libraries on a tour of makerspaces in Northern New Jersey: Parsippany Public Library, West Caldwell Public Library, and Hillsdale Public Library. Parsippany’s makerspace was strictly high-tech (a scanner, Mac laptops, a laminator, a scan-and-cut machine, and a high-quality printer). Every piece of technology was purchased with funds from a $10,000 grant awarded by the New Jersey Library Association. The Head of the Teen Services Department informed me that most of their patrons use the makerspace to create “unique greeting cards.”

West Caldwell Public Library calls its $12,000 worth of audio recording equipment (packed away in boxes) a “makerspace,” as it works toward a community-based oral history project (participation by invitation only). This space was also made possible by a grant, from the Rotary Club of the Caldwells, a local association.

Then, there was Hillsdale Public Library, with the most effective implementation, the highest engagement, and, as you may imagine, the most inflated funding. Their makerspace includes the following:

  • High Tech:
    o Makerbot Replicator 2 (~$2,000)
    o Button Maker (~$300, Gifted from The Friends of the Library)
    o 27” iMac (~$1,800 plus accessories: Yeti USB microphone by Blue ~$115, JVC headphones ~$50, Logitech USB HD camera ~$100, and loads of expensive software, Gifted from The Friends of the Library)
    o Singer Curvy Sewing Machine (~$200)
    o Brother PE-770 embroidery machine (~$700)
    o Silhouette Cameo Cutter (~$300)
    o 4 Premium LittleBits Kits (~$150/each)
  • Low/No Tech:
    o Tools: scissors, razor blades, needle nose pliers, wire cutters, rulers, cutting mats, screw drivers, glue guns, paint brushes, Rainbow Looms.
    o “Connectors”: Elmer’s glue, glue sticks, duct tape, masking tape, scotch tape, paper clips, magnets, wire, soldering irons, string, fishing line, yarn, thread.
    o Other: paper, cardboard, origami paper, corks, ping pong balls, golf balls, wood craft sticks, springs, fabrics, metal bottle caps, plastic bottle caps, polymer clay.

Most of these supplies were made available via the efforts of The Friends of the Library organization, which also donated funding to develop elaborate “Makercamps,” as well as library furniture, museum passes, all summer reading programs, and more.

Throughout the afternoon, I watched an elementary-aged child 3D print a superhero emblem ring, and an elderly woman make button magnets out of her watercolor paintings (under a sign that read, “Express Your Quirky Personality”). I was inspired by the creative energy, but I was also trying to imagine how a similar space could be achieved, and made useful, in an urban, low-income area like my own. It would not be possible for my community to pool seemingly endless funds from an organization like The Friends of the Library, and anyway, it has far different needs and objectives than those expressed through the creation of doohickeys on a $2,000 3D printer. It just did not seem realistic.

Looking for guidance on how to adapt Hillsdale’s model to suit my own library, I asked David J. Franz (Director) about their policies regarding the theft of materials. He curtly said, “What do you mean? We’re giving this stuff away!” I was confused, but supposed that he was referring to supplies such as paper or glue, and so gave the example of LittleBits, of which they had several kits sprinkled across a table, any of which would be rendered useless by the absence of one tiny piece. He stopped to think, and said, “Theft just isn’t a problem for us. I guess that you would have to consider your community.” Exactly. Once, my library tried to set up a “Bookmark Contest” station, and within 8 hours, every pen, pencil, marker, and cardstock template disappeared, without a single entry in the box. Now, my point is not that our patrons steal; my point is that I was struggling to imagine this type of makerspace in our community.


While on the bus (full of 30 white, female librarians… but that is a different blog post), riding between libraries, I was reading Larry Diamond’s “Liberation Technology” and thinking about how publicly accessible technologies present new possibilities for social action. Diamond writes, “Liberation technology enables citizens to report news, expose wrongdoing, express opinions, mobilize protest, monitor elections, scrutinize government, deepen participation, and expand the horizons of freedom” (2010). Some examples of this “liberation technology” resemble offerings within makerspaces, including digital cameras, audio recording equipment, access to YouTube, etc. Makerspaces have also emerged, and are utilized, in the same authoritarian states discussed by Diamond, including China. Knowing the larger realm of possibilities for makerspaces, and inspired to learn more about how they may be put to use elsewhere, I did some reading about their implementation in China.

In 2010, China’s first makerspace, XinCheJian, was established, with six more to follow over the next six months. Each included 3D printers, laser cutters, and other machinery, but the similarities to Hillsdale Public Library stop there. Co-founder David Li says that within XinCheJian, and even on the streets of the surrounding city, there is

“a real-dealing with e-waste. Not just this elite form where people promote reuse, because they want to feel good about themselves purchasing a new phone every 6 months. Here, people reuse on a daily basis discarded parts and fix broken machines rather than buying new stuff. It’s making out of necessity. It’s open source hardware in practice. This is different from the West where open source hacking only exists in theory. Here, the actual maker in the factory is involved, the workers, the repair guy on the street” (Lindtner, 2015).

In stark contrast to XinCheJian, Hillsdale Public Library’s “recycled” or “re-used” materials were in such great quantity that it did not seem likely that many of them were actually amassed via second-hand means (e.g., thousands of clean popsicle sticks were considered recycled materials, and the metal bottle caps were all clean and seemingly unused, with an unbranded, silver finish). At XinCheJian, 9-year-old boys (Hillsdale’s largest group of makerspace users) are not connecting LittleBits to power a tongue-wagging teddy bear; these are adult innovators with access to industry and a manufacturing mindset born out of Chinese tradition. At XinCheJian, for example, a recent project was the construction of an aquaponic planting system, following a design that could easily be put into production.

This difference goes beyond XinCheJian; the driving forces behind China’s makerspace trend include two open-hardware companies, DFRobot and Seeed Studio. Both of these businesses focus on bringing China’s manufacturing culture into the hands of hobbyist makers and start-up companies. Compare this to Arduino and MakerBot Industries (businesses behind the U.S. maker scene), which focus on “hobbyist production, prototyping, and tinkering” (Lindtner, 2015). Lindtner emphasizes “that DFRobot and Seeed Studio, in taking manufacturing itself seriously as a source of knowledge and expertise, did not only develop a niche business but also performed important cultural work” (2015). With the introduction of these two companies and the emergence of makerspaces in China, China’s reputation for cheap, low-quality production was suddenly challenged by expert-level innovation.

Clearly, regardless of physical location, the impact of a makerspace requires more than a button maker and a colorful entryway. As Diamond writes, “It is not the technology, but people, organizations, and governments that will determine who prevails” (2010). Despite having very similar technological access, makerspaces in the U.S. and China have different cultural objectives, and thus different outcomes, which illustrates that the relevance of a makerspace relies on the development of a community, rather than on furious grant-writing to acquire iMacs.

References

Diamond, L. (2010). Liberation technology. Journal of Democracy. 21 (3), 69 – 83. Retrieved from https://lms.pratt.edu/pluginfile.php/514973/mod_resource/content/1/21.3.diamond.pdf

Lindtner, S. (2015). Hacking with Chinese characteristics: the promises of the maker movement against China’s manufacturing culture. Science, Technology, & Human Values. 40 (5), 854 – 879. DOI: 10.1177/0162243915590861

Listening in at an Audio Archive, an observation

The Rodgers and Hammerstein Archives of Recorded Sound at The New York Public Library for the Performing Arts is the second largest archive of recorded sound in the United States. It is home to a wide range of recordings, including but not limited to music in just about every genre, recordings of theater, opera, and comedic performances, oral histories, speeches, radio broadcasts, and field recordings. The archive holds recordings in every kind of format, from wax cylinders to shellac discs, magnetic tape, cassettes, and digital audio files.

The collection can be accessed by visiting the third floor of the Library for the Performing Arts and making a listening request at the audiovisual desk. Patrons must look up the title, author, and class mark, write them down, and present a request slip to the library assistant. Everything is classed according to its format for efficient shelving, not according to genre, record label, or subject. It is easiest to find a recording if one knows the specific track or artist she is looking for; the online catalog is not designed for browsing as one might do in a record store. Visitors can search via a massive card catalog or the song index that is also housed in card catalogs. The card catalogs, though rarely in visible use, still provide something a little more like a browsing experience for those wishing to stumble upon something unexpected. In addition to the catalogs, one can also peruse finding aids for different collections within the collection. However, the finding aids vary, and some have very little information about what a particular title actually contains. Some of the finding aids have handwritten notes or corrections from previous researchers. The sheer volume of material is astounding and somewhat overwhelming. It is truly an amazing treasure trove of a collection.

After making the listening request, a listener is given a set of headphones and is assigned a seat at a numbered listening station.  The listening stations are equipped with computers that have a special software program installed on them.  A patron must wait while the requested audio is collected from the vast archive that is located in the basement of the building.

A little known fact is that library staff known as the playback team wait in the basement to retrieve and play back audio for patrons. They find the requested material and, in the case of vinyl or shellac discs or audio reels, also operate the playback equipment. The playback equipment in the basement is connected to the computers on the third floor so that listeners can hear the requested sounds without actually handling the sometimes fragile audio carriers. The computer software allows listeners to scroll or fast-forward through digital audio files during playback; however, if a listener has requested a vinyl LP, for example, the listener must indicate which track he or she wants to hear via a messaging service on the computer screen. The playback staff is notified of the listener’s message with a little “beep” and will move the needle to the desired track on the record. This can sometimes prove a little difficult for staff when patrons ask to hear specific tracks or parts of tracks repeatedly for their research. Many patrons assume the entire system is computerized and do not realize the human labor involved in bringing the sounds to their ears. They do not always understand why it might take a little time to process their request, in these days when messages are sent into space and back in fractions of a second. Some who do understand the situation send humorous messages to the playback team via the messaging system, like “Dear Audio God, please play the next track.”

Listeners can stay for as long as they like during opening hours. Some researchers, having made special trips from other parts of the country or abroad, will stay the full day or multiple days, only taking short breaks to have lunch in the library cafe. They are trying to get through hours and hours of material during the short time they have in New York. While video or photos allow one to quickly scan and find points of interest, audio does not work the same way, particularly for interviews or field recordings. One must sit and listen in real time, unless the audio has been logged or transcribed. Recent developments in automatic transcription and partnerships with organizations such as Pop Up Archive may prove very useful for researchers in the future.

While the collection holds such a wide array of fascinating recordings and most likely has something of interest to just about anyone, it does not seem that many casual listeners or members of the general public stop in to sample what the archive has to offer. Lack of awareness of the collection and limited accessibility are two issues that perhaps lead to less enjoyment and use of the RHA holdings.

In his article, “The User Experience,” Aaron Schmidt defines user experience as “arranging the elements of a product or service to optimize how people will interact with it.”  Librarians, curators and archivists working with audio collections must think about how people want to interact with the sounds in their collections.  Copyright issues, conservation, audio formats and accessibility are all issues to consider when planning out how audio collections will be encountered and experienced by library users.  In what ways do people want to listen?

From the user experience perspective, one issue members of the public must face is gaining access to the spaces where they can hear the recordings. To access the listening stations, patrons must first place their belongings with security. Then, as outlined above, they must make a request to hear the material, some of which may not be immediately available, since some recordings must be digitized before playback is allowed. This situation may not be a problem for researchers familiar with library procedures who need access to the recordings in order to carry out their work. However, what about the patron who may not even know the collection exists, considering that it is located behind locked basement doors and is difficult to browse online? Audio collections tell fascinating stories through words, sound, and music. However, without more focus on user experience, they may go unheard. Listening spaces in libraries are in need of an update. As audio technology becomes less expensive and more widely available, why aren’t library users offered more options for listening? Innovations in audio technology can raise awareness of collections, improve accessibility, and offer library patrons new ways of listening. How do you want to listen?

Algorithms and Ethics: Brainstorming Solutions

Algorithms are everywhere. Of particular interest are algorithms used “to select what is most relevant from a corpus of data composed of traces of our activities, preferences, and expressions,” which can be extraordinarily powerful tools.[1. Gillespie, T. (2014). The relevance of algorithms. Media Technologies: Essays on Communication, Materiality, and Society. Eds. T. Gillespie, P. Boczkowski, and K. Foot. Cambridge: MIT Press, 167–194.] Such algorithms determine the advertisements we see online or receive in the mail, the posts that appear prominently on social media feeds, even hiring and firing decisions. They impact innumerable aspects of many people’s daily lives. And, as one recent post from the University of Oxford’s “Practical Ethics” blog noted, the way algorithms “function and are used . . . whether in computers or as a formal praxis in an organization – matters morally because they have significant and nontrivial effects.” [2. Sandberg, A. (2015, Oct. 6). Don’t write evil algorithms. (Web log post). Retrieved from: http://blog.practicalethics.ox.ac.uk/2015/10/dont-write-evil-algorithms/.]

Many algorithms provide a great benefit to our society, helping human beings to organize and simplify a constantly expanding and complicated universe of data. In some situations, however, they can also have adverse and inhumane effects – for example, by invading individuals’ privacy or producing results based on incomplete or otherwise flawed data. Accordingly, all involved parties – the information technology innovators who create algorithms, the corporations that use algorithms for business gain, and the technology consumers whose use of algorithm-enhanced products has catalyzed the present ubiquity of such systems – have an obligation to think about and develop ethical approaches to the current landscape. How do we even begin to approach this enormous task?
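To make the “flawed data” concern concrete, consider a toy relevance algorithm of the kind Gillespie describes. The sketch below is purely hypothetical (the item names and click counts are invented for illustration), but it shows how a ranker that scores purely on recorded behavior silently buries anything missing from its data:

```python
# Toy illustration of a "relevance" algorithm fed flawed input data.
# Items are ranked purely by historical click counts; an item that was
# never logged defaults to zero and sinks to the bottom, so gaps in the
# data quietly become invisibility for whatever they describe.

def rank_by_clicks(items, click_log):
    """Return items sorted most-clicked first; unlogged items count as 0."""
    return sorted(items, key=lambda item: click_log.get(item, 0), reverse=True)

click_log = {"article A": 900, "article B": 40}  # "article C" was never logged
ranking = rank_by_clicks(["article A", "article B", "article C"], click_log)
print(ranking)  # ['article A', 'article B', 'article C']
```

Nothing in the code is malicious; the adverse effect comes entirely from the incompleteness of the data the algorithm consumes.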

In March 2015, the Centre for Internet and Human Rights and the Technical University of Berlin hosted a conference on “The Ethics of Algorithms,” at which academics and technology professionals from the United States and Europe grappled with these very issues. A background paper from that conference identified a subset of algorithms that are of the greatest ethical concern, and the specific attributes that require heightened scrutiny: “complexity and opacity, gatekeeping functions [determining ‘what gets attention, and what is ignored’], and subjective decision-making.” [3. Center for Internet and Human Rights.  (March 2015). The ethics of algorithms: from radical content to self-driving cars. (Final draft background paper). Retrieved from https://www.gccs2015.com/sites/default/files/documents/Ethics_Algorithms-final%20doc.pdf.]

That same paper also proposed a handful of appropriate regulatory responses to problematic algorithms, weighing the pros and cons of each. This provides an excellent starting point for any discussion of the ethical challenges of algorithms.

The first proposed response is “algorithmic transparency and notification.” Transparency in algorithms is a challenging proposition – in part because most algorithms are so complex that lay people would not be able to understand them even if they were opened up to scrutiny. In addition, many programmers and corporations keep the secrets of their algorithms close to the vest and would not give them up without a colossal fight. While some openness is a fantastic goal and is necessary for a dialogue about ethical algorithms, on its own this is not a realistic or adequate solution. An alternative to full transparency, however, is “notification,” which envisions more consumer engagement with the manner and extent of data provided to algorithms: “Consumers can demand for control over their personal information that feeds into algorithms which might have a considerable effect on their lives. This includes the rights to correct information and demand their personal information to be excluded from the database of data vendors.”

A second response, “algorithmic accountability,” asks that we question how and why algorithms work as they do: “causal explanations that link our digital experiences with the data they are based upon [which] can empower individuals to better understand how the algorithms around them are influencing their life-worlds.” Indeed, the conference paper describes investigations as to how algorithms produce certain outcomes, even if such investigations do not create a definitive explanation, as an “essential precondition for the public scrutiny of algorithms.”
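One modest form such accountability could take is requiring that a scoring algorithm report not only its output but the contribution each input made to it. The sketch below is purely illustrative (the feature names and weights are invented, and real systems are rarely this simple), but it suggests the kind of causal explanation the conference paper has in mind:

```python
# Hypothetical linear scorer that explains itself: alongside the final
# score, it reports how much each input feature contributed, so an
# affected person can see which pieces of data drove the outcome.

WEIGHTS = {"years_experience": 2.0, "employment_gap": -3.0, "referral": 5.0}

def score_with_explanation(applicant):
    """Return (total score, per-feature contributions) for a linear model."""
    contributions = {f: w * applicant.get(f, 0) for f, w in WEIGHTS.items()}
    return sum(contributions.values()), contributions

total, why = score_with_explanation({"years_experience": 4, "employment_gap": 1})
print(total)  # 5.0
print(why)    # {'years_experience': 8.0, 'employment_gap': -3.0, 'referral': 0.0}
```

Even an approximate breakdown like this gives the affected individual something concrete to contest, which is precisely what an opaque system denies.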

Finally, the paper approaches the possibility of “governments directly regulating an algorithm,” with regulation of algorithms in the financial sector as one appropriate example. This regulatory approach becomes more complicated, however, if applied to government regulation of search engines: “even deciding what would be in the ‘public interest’ is a complex and contested question, exactly because there is no right answer to how [a search engine] should rank its results.” Such regulation would be controversial and difficult (if not impossible) to manage. It could also serve to discourage innovation in the development of algorithms, at a time when we should foster creativity and flexibility among programmers.

None of these regulatory responses is perfect. What this discussion does make apparent, however, is that algorithms are valuable yet imperfect tools and, especially as they become increasingly central to our lives, they should be scrutinized through a lens of fairness and ethics.  

Indeed, as the previously referenced University of Oxford blog post puts it: “We cannot and should not prevent people from thinking, proposing, and trying new algorithms: that would be like attempts to regulate science, art, and thought. But we can as societies create incentives to do constructive things and avoid known destructive things.”

Some awareness of the impact of algorithms on humanity, both positive and negative, can go a very long way, along with consideration of our ethical obligations as the drivers of the algorithm environment. The most important thing is to not fall into a trap of thinking about algorithms – as autonomous as they may appear when designed skillfully – as something independent of their human creators, for which humans do not bear full responsibility.

As Tarleton Gillespie recommends in his article “The Relevance of Algorithms,” we “must unpack the warm human and institutional choices that lie behind these cold mechanisms . . . to see how these tools are called into being by, enlisted as part of, and negotiated around collective efforts to know and be known.” Such inquiry can help us make ethical choices about our use of algorithms. When used thoughtfully, algorithms can be an extraordinary tool for the common good.

Cultural Production and its Discontents: Copyright, Commerce, and Invisible Labor

In Digital Disconnect: How Capitalism is Turning the Internet Against Democracy, Robert McChesney (2013), Gutgsell Endowed Professor in the Department of Communication at the University of Illinois at Urbana-Champaign, broadly characterizes copyright in the internet age as a tool used by media conglomerates to gain monopolistic control over the digital channels through which cultural products are disseminated. He describes the political machinations of the “copyright lobby” on behalf of the corporate media sector as, largely, an infringement of the “openness and egalitarianism” the internet initially promised, and consequently an assault on its potential to contribute to democratic culture and self-government (p. 124-125).

McChesney’s critique of corporate media’s use of copyright recalls the work of two other scholars: Yochai Benkler’s The Wealth of Networks: How Social Production Transforms Markets and Freedom (2006) and Siva Vaidhyanathan’s “Critical information studies: a bibliographic manifesto” (2005), which appeared in the journal Cultural Studies. In those works, both scholars identify the emergence of internet-age information and communications technology as ushering in revolutionary possibilities for cultural conversation and production. Both emphasize the radical possibilities for creative collaboration that internet channels provide, and both identify the implementation and expansion of copyright laws in the United States by entrenched media interests as serious challenges to realizing the full creative potential of those channels. Furthermore, like McChesney, both scholars link the effective use and regulation of new information technologies to the responsible expansion of democracy.

Noticeably absent from all three scholars’ works is a substantive treatment of the principle of economic gain for individual authors as a vehicle for public benefit that, at least expressly, underlies copyright in the United States. In Copyright and Cultural Institutions: Guidelines for Digitization for U.S. Libraries, Archives, and Museums, authors Peter Hirtle, Emily Hudson, and Andrew Kenyon (2009) quote the Supreme Court’s explanation:

The economic philosophy behind the clause empowering Congress to grant patents and copyrights is the conviction that encouragement of individual effort by personal gain is the best way to advance the public welfare through the talents of authors and inventors in “Science and useful Arts” (p. 4).

It is clear that media conglomerates use and manipulate copyright through political pressure to consolidate their economic domination. It is also clear that such companies, like other corporate players in cultural production, negotiate through contracts and employment conditions with the individual actors who create, collaboratively or alone, the cultural products those companies offer to the marketplace and over which they claim and, legally, hold ownership. McChesney, Benkler, and Vaidhyanathan decry the oppressive and self-interested actions of those companies and their effect on the potential for unfettered creative work and innovation for the public good. But in the internet of “openness and egalitarianism” those scholars envision, would individual creators be adequately compensated for their labor and thus incentivized to pursue creative work?

Initial indications are that a society, and the financially interested parties within it, that emphasizes freely accessible information over the rights of copyright holders tends to obscure the labor of individual creative workers and thereby rationalize those workers’ lack of economic gain from the cultural products they create. This is evidently the case in the internet age, when individuals are encouraged to donate their labor to cultural projects, and are contracted by the hour or by the task by companies that treat them as tools rather than as employees. The issue is exacerbated by the invisibility of individual labor when rendered in digital form. As James Moor (1985), Daniel P. Stone Professor of Intellectual and Moral Philosophy at Dartmouth College, wrote in “What is computer ethics?” in the journal Metaphilosophy, “Most of the time and under most conditions computer operations are invisible. One may be quite knowledgeable about the inputs and outputs of a computer and only dimly aware of the internal processing” (p. 266). The invisibility of computer operations also obscures the human labor involved in digital products. Creative cultural products rendered and disseminated digitally appear more and more like public goods, and less and less like individual creations. The October 2015 ruling in Google’s favor by the New York federal appeals court vis-a-vis Google Books and Google Book Search, while reasonable, does not address the disregard for copyright Google undoubtedly showed when it digitized entire libraries of books under copyright, nor does it bear on the company’s future plans for the full scans of those books, which it still holds. It seems that Google may be able to take the resources it wants regardless of legal ownership or procedure, and fight to a favorable resolution while counting legal fees as simply a cost of doing business.

Benkler (2006) warns that “The freedom of action for individuals who wish to produce information, knowledge, and culture is being systematically curtailed in order to secure the economic returns demanded by the manufacturers of the industrial information economy” (p. 16-17). But is a culture that denies individual authors their due financial rewards by ignoring existing copyright likely to advance the kind of collaborative creativity Benkler, McChesney, and Vaidhyanathan desire? The artist and computer scientist Jaron Lanier has, in his book You Are Not a Gadget, written of the “impenetrable tone deafness [that] rules Silicon Valley when it comes to the idea of authorship” (Kakutani, 2010). The problem is exacerbated by the easy international dissemination of digital cultural products and the consequent clashes between U.S. and other countries’ copyright laws, as illustrated by the recent dispute over the recreation of Marcel Duchamp’s chess set. As Library of Congress general reference librarian Thomas Mann (2015) has pointed out, the only alternative to copyright restrictions appears to be “government-regulated control of information,” which carries problems of funding and coercion, so copyright will remain in force. It is to be hoped that the rights of visible and invisible individual cultural producers will be respected, as “Changes in technology do not produce changes…in the need to make a living” (p. 134-135).

Sources

Benkler, Y. (2006). “Introduction: a moment of opportunity and challenge” in The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, 1–18.

Cohen, D. (2015, October 22). What the Google books victory means for readers. The Atlantic. Retrieved from http://www.theatlantic.com/technology/archive/2015/10/what-the-google-books-victory-means-for-readers-and-libraries/411910/

Hiltzik, M. (2015, October 20). Copyright boon or bane? Google Books survives another legal challenge. Los Angeles Times. Retrieved from http://www.latimes.com/business/hiltzik/la-fi-mh-google-books-survives-another-legal-challenge-20151020-column.html

Hirtle, P. B., Hudson, E. & Kenyon, A. T. (2009). Copyright and Cultural Institutions: Guidelines for Digitization for U.S. Libraries, Archives, and Museums. Cornell University Library.

Kakutani, M. (2010, January 14). A rebel in cyberspace, fighting collectivism (Review of the book You Are Not a Gadget). The New York Times. Retrieved from http://www.nytimes.com/2010/01/15/books/15book.html?pagewanted=all&_r=0

Mann, T. (2015). The Oxford Guide to Library Research (4th ed.). Oxford University Press.

McChesney, R. (2013). Digital Disconnect: How Capitalism is Turning the Internet Against Democracy. New Press. Chapters 3–5.

Moor, J. H. (1985). “What is computer ethics?” Metaphilosophy 16(4): 266–275.

Norton, Q. (2015, September 8). The international fight over Marcel Duchamp’s chess set. The Atlantic. Retrieved from http://www.theatlantic.com/technology/archive/2015/09/the-international-fight-over-marcel-duchamps-chess-set/404248/

Vaidhyanathan, S. (2005). “Critical information studies: a bibliographic manifesto.” Cultural Studies 20(2/3): 292–315.