Confronting Bias and Antiquated Terms in the Catalog

Even in fields purported to be objective, individual bias is always present. Knowledge organization structures are no different, constructed as they are by a select few people in power. It is no surprise, then, that the bias inherent in cataloging terms has been the subject of debate over the past few decades. This debate is the focus of Emily Drabinski’s article “Queering the Catalog: Queer Theory and the Politics of Correction” (2013), which points to the use of antiquated, often offensive language and subject headings within the dominant cataloging systems. Drabinski makes the point that cataloging systems tell a story about the information they represent, and that they have told a story informed by a white, heterosexual, Christian, patriarchal perspective. She highlights a number of efforts over the years to petition governing bodies to change problematic language and groupings, but believes that this approach falls short of its intended goal. Drabinski advocates instead for an approach rooted in queer theory, which rejects the project of changing the catalog and would rather keep the problematic language in place to make the catalog’s bias obvious and apparent to researchers, allowing them to “very quickly understand that catalogs reflect a particular point of view rather than an objective truth.” While this understanding is important for users to have, Drabinski’s reservations are too extreme, and the education she proposes can be accomplished while still changing offensive terminology and subject headings.

The basic function of cataloging is to sort materials into groups in ways that make it easier for users to find what they’re looking for. As cultural perspectives change, it is important for the catalog to reflect those changes simply so that materials remain discoverable. If a user wants materials on certain topics, they will not find them as easily when those materials sit under archaic subject headings. A user today would not think to look for materials dealing with homosexuality under “sexual deviance,” nor would they want to. It is somewhat ironic that queer theory applied to cataloging maintains that identity is fluid and subject to constant change, yet insists on fixing the catalog in its original state. Drabinski explains the dissatisfaction with changing classifications by asserting that “the political focus on correcting classification structure and subject language solidifies the idea that the classification structure is in fact objective and does in fact tell the truth, the core fictions—from a queer perspective—that allow the hegemony of a universalized classification structure to persist.” The opposite seems closer to the truth: changes made to the language of the catalog are admissions of past error and an attempt to make up for those errors. They demonstrate that the structure is subject to reconfiguration, and in some ways they document shifting perspectives over time.

At one point in the article, Drabinski cites an example from a 1972 essay by Joan Marshall protesting the use of the word “Mammies” as a subject heading, in which Marshall asks, “Could any of us, without mumbling embarrassed and probably useless apologies, even if we dared, tell a young, militant, Black woman who wanted material on this subject to look under mammies!” This is a valid question, to the point where it seems almost rhetorical. Drabinski, however, dismisses the question and comments only that the suggested replacement, “Negro women,” would be seen as offensive today, seemingly suggesting that “Mammies” should have been kept as a subject heading. Drabinski appears to see this interaction as an opportunity for educating the user on the biases inherent in cataloging, but this presupposes that the user is not so offended that they leave the library and become discouraged with the entire system. What Drabinski sees as an access point could instead damage a user’s experience with the library.

Drabinski puts forth that queer theory applied in this context “challenges the idea that classification and subject language can ever be corrected once and for all,” but no one need claim that this language, however many times it is corrected, will remain correct forever. Changing terminology is and will always be an ongoing struggle because identities are fluid and shift constantly. That only means, then, that the catalog must constantly be reappraised and changed. This might be accomplished through the communication between librarian and user that Drabinski suggests will occur when people are confronted by terms that offend them: users can voice their complaints, and after discussing the history of the catalog, librarians can lobby to change the problematic language.

While changing subject headings is important, it is by no means a bad idea to also educate users on the fallibility of the catalog. History must not be forgotten or revised, and the existence of these subject headings should certainly be noted and taught. Bias, racism, and bigotry informed a great deal of these headings, and that history is worth teaching to concerned individuals. But does that really mean we should accept those subject headings and let them stay for the benefit of a discussion which, depending on the user, might never take place?

These two goals are not mutually exclusive by any means. Discussions about the catalog can still take place on a personal level and in educational settings whenever the librarian chooses to initiate them. At the same time, when terms are changed in the catalog, it may be wise to annotate them in some way. The changes might be indicated in the same space cross-references occupy, much as a dictionary provides archaic, out-of-use definitions for words after supplying the modern ones. Something like this may be difficult, but it would let items be found under socially acceptable subject headings while still acknowledging the insensitive language previously used. Users could find materials under the current, inoffensive terms and still learn the history of the catalog in context. This could even better accentuate the biases in the catalog, and invite users to challenge its authority and help reshape it.
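To make the idea concrete, here is a minimal sketch of what such an annotated heading record might look like. This is not any real ILS schema, and the terms and dates below are purely illustrative; the point is only that a record can surface its superseded terms the way a dictionary lists archaic senses.

```python
from dataclasses import dataclass, field

@dataclass
class SubjectHeading:
    """A hypothetical heading record that keeps its superseded terms visible."""
    current: str
    deprecated: list = field(default_factory=list)  # (term, years in use) pairs

    def display(self) -> str:
        # Render the current term, followed by its retired predecessors.
        history = "; ".join(f'formerly "{t}" ({y})' for t, y in self.deprecated)
        return self.current + (f" [{history}]" if history else "")

heading = SubjectHeading(
    current="African American women",
    deprecated=[("Negro women", "1972-1975"), ("Mammies", "pre-1972")],
)
print(heading.display())
```

A display like this lets a browser find the material under the current term while the record itself documents, in context, the language the catalog once used.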

The catalog is a representation of the library, and while these subject headings give insight into how catalogers once viewed the world, they do not represent the positions and viewpoints of catalogers and librarians today. The issue of language in subject headings is analogous to the same issue in federal and state laws, some of which were modified earlier this year (Kelkar, 2016). When New York eliminated the term “Oriental” from government documents in 2009, then-Governor David Paterson said, “The words we use matter. We in government recognize that what we print in official documents or forms sets an example of what is acceptable” (Chan & Lee, 2009). The same can be said for libraries: the words we use in the catalog also set examples of what is acceptable, and it is wrong to present offensive terms as appropriate descriptors.


Chan, S., & Lee, J. (2009). Law Bans Use of ‘Oriental’ in State Documents. Retrieved from http://cityroom.blogs.nytimes.com/2009/09/09/law-bans-use-of-oriental-in-state-documents/?_r=1

Drabinski, E. (2013). Queering the Catalog: Queer Theory and the Politics of Correction. The Library Quarterly: Information, Community, Policy, 83 (2), 94-111.

Kelkar, K. (2016, May 22). Obama signs bill eliminating ‘Negro,’ ‘Oriental’ from federal laws. Retrieved from http://www.pbs.org/newshour/rundown/obama-signs-bill-eliminating-negro-spanish-speaking-oriental-from-federal-laws/


Observation of Metis Cataloging System

For this post, I observed elementary school students in the Berkeley Carroll Primary School library, where I work as an assistant librarian, interacting with the Metis cataloging system. This is in response to our Week 3 discussion of categorization, particularly Drabinski’s assertion that classification and subject language are inherently broken. While I’m inclined to agree that there will always be flaws in cataloging, I was interested in examining how a user-driven, specialized system might take steps in the right direction.

Metis was developed by librarians at Fieldston, a fellow New York independent school, as a cataloging system tested and honed by children to focus on the needs of young browsers; everything from the simple category names (like “Making Stuff” or “Scary”) to the categories themselves (“Animals” and “Pets” are separate sections) has the child audience in mind. There are twenty-six categories, each given a letter of the alphabet and a distinct icon, but the arrangement of these categories varies by school: for instance, while the Graphic Novel section’s suggested placement is between U (Scary) and W (Memoir), Berkeley Carroll assigns this popular section down in Z to keep its large reader base from crowding out prose readers. Subcategories are also at local discretion, allowing students and librarians to further cater to the interests of their specific audience; this is why Berkeley Carroll, which has a major unit on ocean life in third grade, has a specialized “Marine Animals” subcategory that fills a full half of the “Animals” category.
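The letter-plus-local-override arrangement can be sketched as a small data structure. This is only an illustration of the idea, not the official Metis scheme; the `localize` helper and the three-category layout are my own invented names.

```python
# A suggested (partial) layout: letter -> category. Only three of the
# twenty-six categories are shown, matching the example in the text.
SUGGESTED = {"U": "Scary", "V": "Graphic Novels", "W": "Memoir"}

def localize(suggested: dict, overrides: dict) -> dict:
    """Apply a school's local letter reassignments to the suggested layout.

    `overrides` maps a category name to the letter the school moves it to.
    """
    localized = dict(suggested)
    for category, new_letter in overrides.items():
        # Find and remove the category's suggested letter, then reassign it.
        old_letter = next(k for k, v in localized.items() if v == category)
        del localized[old_letter]
        localized[new_letter] = category
    return localized

# Berkeley Carroll moves the crowded Graphic Novels section down to Z.
print(localize(SUGGESTED, {"Graphic Novels": "Z"}))
```

The same mechanism would cover local subcategories like “Marine Animals”: the shared scheme supplies the defaults, and each library layers its own reassignments on top.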

The library I observed is the size of a small classroom and contains roughly six thousand books. It’s one of two “hubs” in the primary school building: the “Red Hub” one floor below is largely for first- and second-graders, while my “Yellow Hub” covers third and fourth, but any student is free to use any hub. Metis spine labels show students the assigned letter, section, and subsection, with authors not necessarily given focus.

My observation consisted of three hour-long periods of peak usage (11:45 through 12:45) over three days in the Yellow Hub. The most notable trend is the marked distinction between third- and fourth-grade browsers: third-graders are generally new to the Yellow Hub and more often ask for help locating books, while fourth-graders easily find their sections of interest. The Metis system still requires some instruction to interact with, but I take the ease with which fourth-graders search and third-graders learn as a sign of its effectiveness.

More specifically, my assessment of Metis’s intuitive nature was bolstered when two third-graders who asked for help finding books on day one were searching independently by day two or three. A fourth-grader, in recommending a book to her friend, showed her where it was in the Fantasy section, then brought her to Tales (containing mythology) and Sci-Fi to suggest further reading. When students ask for recommendations, they virtually always use the same language as Metis, specifying that they want a “scary book” or a “realistic book.” There’s a chicken-and-egg conundrum here, particularly for students more accustomed to the system: is Metis’s language capturing how these children self-categorize, or have they merely adapted to the jargon? I’m inclined to go with the latter, but regardless, it’s clear that they navigate Metis far more easily than I did in my elementary school’s Dewey Decimal library.

While Metis’s positive qualities above hold true, it’s clear even in this basic observation that its localized customization is not a cure-all. Many students of both grades still consistently ask where certain books are located without attempting to search; while this may say more about the convenience of a librarian than the difficulty of searching Metis, it still suggests that the system remains inferior to a human search engine. That much is obvious, as categories are artificial constructs that users must learn, and even a universal, simply taught cataloging system, among its myriad problems (read: Drabinski), can’t take something as basic as room layout into account. There’s bound to be a learning curve in every library, and the fastest option for finding books will practically always be a librarian.

My most relevant observation about the failings of Metis came in a student’s comment that Redwall, a series about warring woodland creatures, is located in T (Adventure), while Warriors, a series about warring cats, is in V (Animal Fiction). While Redwall’s animals are far more anthropomorphic than their Warriors counterparts, wearing clothes and bearing weapons and standing on two legs, this did not convince the student that the two should be separate. I’d like to say that the student-driven Metis system called for a change, or at least a larger inquiry with more students weighing in, but the hassle of such a shift (Redwall is a physically massive series) prevented any section changes. The ideal behind Metis is noble, but in reality it’s impossible to fulfill every demand, even understandable ones like this with little argument to be had. Perhaps if a browser’s only option for locating a book were independent searching, there would be more of an effort to further perfect cataloging, but again, a librarian on location mitigates the problem.

Still, the appeal of a specialized catalog is self-evident; students who do opt to browse can easily find what they’re looking for when the system speaks in their language, and issues like the Redwall/Warriors incident are hardly limited to Metis. There will never be a complete solution to the intrinsic flaws of cataloging, but ditching a universal standard like Dewey for a library-by-library approach, using categories and language tailored to the population of local readers, seems to be a step in the right direction.

Stray observations:

  • The population I observed is obviously limited to Berkeley Carroll students, a deeply imperfect sample of children their age. Its small student population and focus on independent learning (students, for instance, use a self-checkout service) are hardly the norm in American or global schools, nor is the price tag; even in an observation this basic, we should take the results with a grain of salt.
  • As mentioned earlier, graphic novels are easily the most browsed section in the library, to the point where checkouts are limited to one graphic novel at a time (the book limit is normally two per day, and five total books checked out at a time). With the gradual acceptance of the medium’s value and the explosion of talent being published by imprints like First Second and Papercutz, I can imagine a future where they’re integrated with works of prose due to the sheer size a Graphics section would take up.
  • Redwall is so much better than Warriors it’s not even funny.

References:

Drabinski, E. (2013). Queering the Catalog: Queer Theory and the Politics of Correction. The Library Quarterly: Information, Community, Policy, 83(2), 94-111.

Addressing the Benefits and Limitations of Traditional and New Methods of Research

When attempting to understand the world around us, we begin by asking a simple question. Research is how we answer those questions with the methods and tools available. As information sources and technology have developed, access to that information has broadened. The event I attended taught me that our answers are not always in the places we look first.

Digital Art History: New Tools, New Methods, hosted by the New York Chapter of the Art Libraries Society of North America (ARLIS/NA/NY) at the Metropolitan Museum of Art, focused on the development of the Frick’s Digital Art History Lab (DAHL). ARLIS/NA is a non-profit organization created by art librarians in 1972 to address the lack of communication within the field of art libraries. Today ARLIS connects art librarians, and those interested in the field, through “programs designed to provide members with introductions to new technologies, new cultural institutions and to current artistic activities.” Digital art history is the most recent development in how technology helps answer established questions.

Staff members of the Frick discussed the recent developments at the DAHL and in the world of digital art history. The talk focused on distinct software systems and methodologies that could aid our own research. The key points that stuck out to me were Dr. Louisa Wood Ruby’s presentation, aided by Samantha Deutch, on new developments in photoarchives; Dr. Titia Hulst’s use of innovative methodologies; and Ellen Prokop’s work with GIS. I would like to connect these new methodologies to the steps in PERCS’s “The Methods of Field Work,” and show how they relate to technological advances outside the social sciences [1].

Dr. Louisa Wood Ruby is the head of Photoarchive Research at the Frick. Working with a group of international photoarchivists, she helped create PHAROS, an art research database. PHAROS is still a work in progress, but the public has minimal access to what is already done. The goal of PHAROS is to make resources available to researchers and institutions for finding lost copies of masterpieces, including previously unattributed works. The database will hold collections from North America, Europe, Latin America, and Asia; this range of material, spanning western and non-western cultures, will make the software unique. The Frick uses PHAROS to reorganize its photo collection by consolidating misplaced copies. In relation to PERCS step 30, we as researchers ask, “Do we have a responsibility to choose a venue of publication that will speak more directly to the participant community?” [2] I believe PHAROS will create a responsibility to share with researchers and large institutions collections for studies never conceptualized in the past due to a lack of informational resources.

Another new resource of information implemented by Dr. Ruby is ARIES (Art Image Exploration Space), built with the aid of Samantha Deutch, the Assistant Director of the Center for the History of Collecting at the Frick. DAHL, along with NYU’s computer science department, created ARIES as a new tool for image analysis. ARIES allows art historians to bring technology into long-standing practices like comparing and contrasting attributes. Through ARIES a researcher can identify previously unknown works, with the ability to manipulate images to prove a connection to a masterpiece. PERCS step 21 addresses the issue of changing conditions in research and how to maintain promises, or stated truths [3]. With new technologies available to aid in research, many of those stated truths can no longer be considered unquestionable. Debated theoretical claims of the past can now be questioned and put to the test. ARIES can help researchers dive deeper into their own curiosities to prove new theories.

An interesting software system introduced to the digital art history world at this event was Cytoscape, presented by Dr. Titia Hulst. Cytoscape was initially created for large data-collecting lab sciences like biology, to envision microscopic entities as a network of connections through imagery. Dr. Hulst used Cytoscape on a large body of data gathered about art dealers and collectors in New York City during the 1960s. With material collected on such a grand scale, finding a connection would be like finding a needle in a haystack. PERCS step 12 states that research “often involve[s] taking knowledge from one community for the use by another,” in this case using software from biology to investigate aspects of art history [4]. The most interesting part of Dr. Hulst’s study, for me, was realizing how art historians have used technology to remain visual learners. Cytoscape allows for this way of learning by querying data tables to create connections overlooked and unimagined until displayed as a concrete digital image.
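The core idea, turning rows of tabular data into a network whose shared nodes expose connections, can be illustrated in a few lines. The rows below are invented examples, not Dr. Hulst’s actual dataset, and this is plain Python standing in for Cytoscape’s query-and-visualize workflow.

```python
from collections import defaultdict

# Hypothetical (dealer, artist) rows of the kind a 1960s art-market
# dataset might contain.
rows = [
    ("Castelli Gallery", "Jasper Johns"),
    ("Castelli Gallery", "Roy Lichtenstein"),
    ("Green Gallery", "Roy Lichtenstein"),
]

# Build an undirected graph: every row links a dealer node to an artist node.
edges = defaultdict(set)
for dealer, artist in rows:
    edges[dealer].add(artist)
    edges[artist].add(dealer)

# Two dealers are connected if they handled the same artist -- the kind of
# link that is invisible in the table but obvious once drawn as a network.
shared = {a for a in edges["Castelli Gallery"] if "Green Gallery" in edges[a]}
print(shared)
```

Cytoscape does essentially this at scale, then renders the graph so that the connections can be seen rather than hunted for row by row.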

Another technology that aids art historical visual learning is GIS (Geographic Information System). Ellen Prokop, the Associate Photoarchivist at the Frick’s reference library, introduced how GIS can help answer questions while also posing new ones. As PERCS states in step 11, you need to find your motivation for doing the work, because we are inherently curious and want to fulfill that curiosity [5]. To focus a study through a period eye and understanding, GIS can be used to recreate a space back in time. The project Prokop focused on was the influence of El Greco on artists of nineteenth-century Paris, like Cezanne. She overlaid today’s map of Paris with one from the late nineteenth century, along with queried data to focus her search. Prokop made a connection that counters the idea that El Greco was the father of modern Parisian art: she noticed, through the layered maps, that the works by El Greco seen by the public were forgeries, while the real ones were on limited view in Paris at the time. While GIS is typically used for archaeological research, art historians have found a way to use the software to develop questions and find information hidden within maps, which we can now use to understand an art form through a historical lens.

Thanks to art historical research done through GIS, ARIES, and PHAROS, previously unanswerable questions can be satisfied through new information resources. These databases and software systems would not have been possible without the collaboration of art historians, librarians, computer scientists, and lab scientists, a collaboration that has evolved research capabilities unimagined twenty years ago.

[1] Elon University. Program for Ethnographic Research & Community Studies. The ethics of fieldwork module. Retrieved from http://www.elon.edu/docs/e-web/org/percs/EthicsModuleforWeb.pdf

[2] Elon University, 12.

[3] Elon University, 9.

[4] Elon University, 6.

[5] Elon University, 5.

References

https://www.arlisna.org/about/history

http://images.pharosartresearch.org

https://ukiyo-e.org/about

http://www.cytoscape.org

http://www.esri.com/what-is-gis

http://www.frick.org/research/DAHL/projects

Conflicts with Cataloguing Structures

Emily Drabinski’s article, “Queering the Catalog: Queer Theory and the Politics of Correction,” demonstrates the challenges presented by our desire to open up the classification and cataloging systems within library structures. Since the late 1960s, scholars and professionals of information studies have challenged the neutrality of the Library of Congress’s traditional classifications and subject headings, demanding that vocabularies be corrected to reflect current social and political contexts. While specific classification and cataloging decisions in library structures have been “fixed,” Drabinski’s queer-theoretical reading demonstrates that any corrections made are only conditional and never final.


Libraries are stable spaces, controlled through traditional classification structures and vocabulary systems that provide standards and guides for both producers and consumers of information. Consequently, the static nature of libraries makes them resistant to change. As Drabinski argues, this is problematic, since libraries are dependent on language. Language transforms over time; it is adapted into new contexts and given new meanings. The information acquired through libraries is therefore organized and identified through classifications and subject headings that become socially and politically incorrect over time. More simply, information and materials within libraries end up being misrepresented.


The root of this problem stems from the static quality of hegemonic library classification and cataloging systems. To combat this misrepresentation, Drabinski considers continuous revisions and additions to the library’s classifications and subject headings necessary. But while she acknowledges the value of such corrections, they still conform to the hierarchical power structures within the library’s catalog. If we break down this system, we can identify that the cataloguer who originally classified a material, the critical cataloguer who requested the revision, and the Library of Congress, which judges whether classifications and subject headings are suitable, all hold significant hegemony over how information is represented. To compensate for our inability to dismantle this hierarchy, Drabinski asserts that librarians and catalogers should engage in discourse with users on the limitations of our cataloging systems. However, this response is not sufficient. Libraries may not have enough staff or resources to fully dive into their specific cataloging and subject heading issues. Users may not seek out library professionals to voice their concerns, or even have the luxury of time to listen to the history and reasons for the library’s current system. While Drabinski approaches the issues of hegemonic cataloging systems head on, I suggest we incorporate a sideways approach.


The purpose of forming knowledge organizations and structures within libraries is to enable both producers and consumers of information to navigate and access quality sources. How rich and extensive the records are in describing the library’s materials determines how much quality information is communicated. As Christine Pawley states in her article “Information Literacy: A Contradictory Coupling,” “The decisions that indexers, catalogers, and classifiers make in providing intellectual access to the contents of books and articles through subject headings, and index terms, and physically or virtually allocating works to particular areas of the library collection, contribute to the ways in which researchers think” (Pawley, 2003). Pawley recognizes that the production of accessible knowledge does not end at the physical or virtual library shelf, nor does it move in one linear direction. It is a process that continues to recontextualize sources, perpetually moving, connecting, and growing. Rather than remain within the confines of controlled cataloging structures, we should widen and loosen our perspective. As Pawley notes of Hope Olson’s argument, we must relinquish control and create openings within these structures so that power can leak out as well as in. Therefore, when forming classifications and subject headings we cannot use what Ross Todd identifies as a “one-size-fits-all” approach. We must engage in a more critical and collaborative approach that considers all aspects of a source: its content, the context of its production as well as its author, its history (specific to the material item and the larger picture), and its relationship to the works that inspired it and the ones it inspired. As these facets change and evolve with time, we must continue to engage in this process of reformation and discourse. Our classification systems should always be in flux, evolving, and changing relationships with one another.


While this method may be too much work for libraries to continuously manage, as well as financially burdensome, especially for large collections, such a model does exist and has been quite successful in dissolving the rigid structures of our current cataloging system. Artsy is an online art collection curated through its own classification system and technological framework, called “The Art Genome Project.” The Art Genome Project maps characteristics, “genes,” that connect artists, artworks, architecture, and design objects throughout history; currently, over 1,000 genes exist within the project. While this system is similar to tagging and mapping local vocabularies, Artsy’s genes are more firmly rooted and cohesive, with the 1,000+ characteristics weighted proportionally to one another. Categories within the Art Genome Project are displayed as a complete list, organized numerically and alphabetically. Therefore within “B” we find “Bauhaus” (an artistic movement) listed below “Bathers” (a subject found within artworks). If we closely examine the art that relates to the gene “Bathers,” Artsy provides a description of this subject matter and its larger history, a searchable list of artworks that contain the subject, and a list of related categories and artists. This structure enables users to access and obtain information through a web of related knowledge. Additionally, Artsy’s widened approach allows for collaboration with artists, galleries, museums, auction houses, scholars, and institutions, among many others. Such collaboration and discourse ensures that the information within Artsy’s gene web is of quality and remains accessible. Furthermore, as Artsy collaborates with other leaders in the field, it is continuously acquiring new artworks and information, adding new genes, and restructuring relationships.
Although Artsy is a virtual collection, I believe that we can apply the same techniques within the physical spaces of the library. As seen with Artsy, the actual space where the work resides is not crucial. Rather, what is important is the ways in which information is represented within these webs and consequently communicated back to users.
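The gene-web idea described above can be sketched as a toy structure: artworks tagged with weighted genes, and related genes discovered through shared artworks. The gene names echo Artsy’s public categories, but the weights, artworks, and helper functions here are entirely made up for illustration.

```python
# Hypothetical genome: artwork -> {gene: weight}. Weights are invented.
genome = {
    "The Bathers": {"Bathers": 1.0, "Impressionism": 0.8},
    "Bauhaus Building Dessau": {"Bauhaus": 1.0, "Architecture": 0.9},
    "Large Bathers": {"Bathers": 1.0, "Post-Impressionism": 0.7},
}

def works_with_gene(gene: str) -> list:
    """All artworks carrying a given gene, like Artsy's searchable lists."""
    return sorted(w for w, genes in genome.items() if gene in genes)

def related_genes(gene: str) -> list:
    """Genes that co-occur with `gene` on at least one artwork."""
    related = set()
    for genes in genome.values():
        if gene in genes:
            related |= set(genes) - {gene}
    return sorted(related)

print(works_with_gene("Bathers"))
print(related_genes("Bathers"))
```

Navigating from “Bathers” to its co-occurring genes is exactly the web-of-related-knowledge browsing the essay describes, and nothing in the structure depends on where the physical works reside.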


Drabinski, Emily. “Queering the Catalog: Queer Theory and the Politics of Correction.” The Library Quarterly: Information, Community, Policy, Vol. 83, No. 2 (April 2013), pp. 94-111. http://www.jstor.org/stable/10.1086/669547

Pawley, Christine. “Information Literacy: A Contradictory Coupling.” The Library Quarterly: Information, Community, Policy, Vol. 73, No. 4 (Oct., 2003), pp. 422-452. http://www.jstor.org/stable/4309685

Rosenzweig, R. (1991). “Politics and anti-politics in librarianship.” The Progressive Librarian, No. 3 (Summer 1991), pp. 2-8. http://www.progressivelibrariansguild.org/PL_Jnl/pdf/PL3_summer1991.pdf

Artsy – The Art Genome Project. https://www.artsy.net/categories

Co-evolution of Humanity and Technology

As much as I enjoy reading Norman’s thoughts on the co-evolution of humanity and technology in his article from The Invisible Computer, I find his views on technology’s impact on humanity a bit pessimistic. He sees technology and machines as foreign entities beyond our control, while overlooking many aspects of technology, such as its potential as a tool for creation and knowledge transfer.

Norman makes a good point in suggesting people are “forgetful of details, with a poor sense of time, a poor memory for facts and figures, unable to keep attention on a topic for more than a short duration, reasoning by example rather than by logic, and drawing upon our admittedly deficient memories of prior experience.” Still, I think comparing humans to analog technology is a questionable analogy. Analog technology was created for the purpose of storing and reproducing information in a systematic way. It is as much machine as any modern technology, only dated; calling ourselves analog suggests we are still living in the past. In Norman’s view, “people are analog, insensitive to noise, insensitive to error. People extract meanings, and as long as the meanings are unchanged, the details of the signals do not matter. They are not noticed, they are not remembered.” I find it hard to agree with this statement. While we do have a high tolerance for errors, we are sensitive to noise and sensitive to error. Our accumulated knowledge and experience have taught us that noise is detrimental to decision-making. We spend tremendous amounts of time and effort minimizing errors by documenting, reproducing, and examining data to identify patterns. With the help of technology, we are constantly learning from our mistakes and trying to make sense of the world through analyzing new and historic information.

Norman points out that "human beings are the results of millions of years of evolution, where the guiding principle was survival of the species, not efficient, algorithmic computation." Norman would be right if we still lived in prehistoric times as hunters and gatherers, but when people started forming societies organized around agriculture and institutions, our priorities shifted. As populations grew, so did our need for stability and predictability. Our obsession with efficiency and predictability can be traced back to the need for better forecasts to increase food production and sustain an ever-growing population, and our obsession with stability is rooted in governance and the establishment of order. Human beings have "co-evolved with social interaction, cooperation and rivalry, and communication." Society would not have thrived and progressed without the stability and the improved efficiency in toolmaking and resource use that technological progress made possible.

Norman also argues that technological progress and our obsession with efficiency in production have reduced us to machines in an assembly line: "hence too came the dehumanization of the worker, for now the worker was essentially just another machine in the factory, analyzed like one, treated like one, and asked not to think on the job, for thinking slowed down the action." While this is somewhat true, technological progress also has the potential to save us from dehumanization by automating low-skill jobs and giving us more time to focus on creative tasks that require cognitive skills. Although there have been numerous debates on the economic implications of job displacement by technology, improved efficiency through automation has made goods and services more affordable and accessible, and advances in technology have created many job opportunities for the creative industries and for information professionals. Norman raises the issue that technology has moved so fast we are unable to keep up: "The slow evolutionary pace of life is no longer up to the scale and pace of technological change. The accumulation of knowledge is enormous, for it increases with every passing year. Once upon a time, a few years of schooling — or even informal learning — was sufficient. Today, formal schooling is required, and the demands upon it continually increase." While Norman's statement resonates with many of us who are always in pursuit of new knowledge to stay competitive, I feel that knowledge does not accumulate perpetually. Knowledge becomes obsolete as we find better ways of doing things. For example, while it is helpful to understand machine code, very few software engineers program in machine code today. New knowledge supersedes old knowledge, and whatever knowledge we find relevant today may not be relevant a decade later.
Our pursuit of knowledge goes as far back as prehistoric times, when we sought ways to identify weather patterns and develop better farming techniques. Technological advances facilitate the transfer of information and help us stay informed of nascent and relevant knowledge. There is no shortage of vast libraries of digital information and self-guided online education. The sufficiency of education is subjective and highly dependent on individual need; it is up to each of us to decide whether to take advantage of these resources and adapt to an ever-changing world.

Norman brings up some interesting points, but I find his views a bit dated. I agree that technology should be created in a way that complements us, but I find it questionable that "we are compliant, flexible, tolerant. Yet we people have constructed a world of machines that requires us to be rigid, fixed, intolerant." Machines do not require us to be rigid and intolerant of errors. Machines are programmable and follow rules set by humans; they are as flexible as we build them to be. The way machines are built reflects our ability to apply knowledge to build tools that advance our cause. With advances in digital technology, electronic devices have become portable and computer processors far more powerful. The costs of producing and storing information have dropped sharply, and access to information has become much easier. Without accuracy and precision, much of the technological progress we have come to appreciate would not exist today. Although "digital signals are limited in values," they have enabled enormous creativity and information freedom. As complex as computing devices have become, they are still largely single-purpose tools that cannot make decisions and are capable only of performing tasks in repetition. Machines are tools that help us create better tools. While I agree that "We have constructed a world of machinery in which accuracy and precision matter. Time matters. Names, dates, facts, and figures matter. Accurate memory matters. Details matter," I don't think we have forgotten that we are still good at experimenting and inventing through trial and error.

 

Norman, D. A. (1998). The Invisible Computer: Why Good Products Can Fail, the Personal Computer is So Complex, and Information Appliances are the Solution. MIT Press. Chapter 7: Being Analog

Trickle-down Information: The Enlightenment Model and Information Dissemination in the Modern Library

Note: I believe this subject has the potential for expansion and further investigation. Any feedback, criticism, and questioning would be greatly appreciated as I am considering expanding this essay into a full research topic.

The Library is an establishment intended for the dissemination of information, and its modern foundation is historically rooted in the Age of Enlightenment. As literacy and readership increased, institutions of knowledge and governing bodies began to invest in the construction and design of libraries. [1] The intellectual and wealthy elite of the Enlightenment spurred these modes of knowledge delivery, placing themselves as the creators and controllers of information. The library and the university were established as means of circulating created information through a top-down structure. At one point, access was highly restricted, often denied to women, people of color, and those in poverty. [2] Today these are no longer enforced rules of conduct, but the established system continues to place the same groups of people at a disadvantage.

Many critics note the power dynamics established in the creation and distribution of knowledge based on the Enlightenment model. The distribution of information from the creator to the consumer continues to enforce this model of dissemination and the related top-down power structure. [3] The researcher, the student, and the public library patron are only able to access the resources their institution can afford or will allow. Libraries emphasize obtaining and providing collections that will meet the needs and expectations of their community. However, the community, as consumers, is not in a position to greatly influence the collection and distribution of information.

The Digital Age is believed to provide greater opportunity for disseminating information; however, most scholarly articles remain visible only behind glass walls. The practice of open access alone is not a solution to inaccessibility, since publishers and institutions often hold most republication rights to any scholarly production. "Library access to electronic resources is another widely acknowledged economic barrier." [4] Classification and distribution reinforce the treatment of information as a commodity available for commercialization. [5] Copyright holders limit distribution to specific journals, repositories, and databases. The biggest databases, often with the most diverse range of publications, are accessible only through educational institutions, including libraries. The consumer is dependent on which institutions they may access and what those institutions choose to make available.

Furthermore, laws such as the Stop Online Piracy Act (SOPA), the Protect IP Act (PIPA), and the Research Works Act have often run the risk of further hindering an open-access system of information. [6] Opponents of open access often view information as a risk in the wrong hands. Peter Schmidt of The Chronicle of Higher Education cites concerns about "the publication of inferior and unreliable journals" and "the risk that research in fields such as medicine will fall into the hands of people who might misuse it." [7] Although these bills never became law, their proponents echo the power structures and the control of information exemplified by the Enlightenment Age.

The Library places great emphasis on obtaining and distributing materials of authority. We continue to treat institutions of knowledge, universities and bodies of government, as the authorities on particular forms of information, and information produced and distributed through these institutions is considered the voice of scholarly authority. Minority groups are often underrepresented in academic institutions, and sometimes banned from shelves and curricula. [8] The continued movement toward open access creates new opportunities for equitable information distribution. In a consumer-based society, it is not surprising that information is treated as a commodity for trade: publishers and institutions manage how users access information by selecting exclusive databases for distribution. The duty of the modern library is to move away from a neutral stance and defend accessibility, free speech, and the freedom of information. The Library as a disseminator is the door between creator and consumer. The ethical librarian should provide open access that will benefit and improve the lives of library patrons, and the Library, as an institution of authority, should be a voice of dissent against political campaigns aimed at restricting information access. [9] The dissemination of information through a top-down power structure places those at the bottom at a significant disadvantage. The purchase and exchange of information is designed to benefit the publisher and the distributor, enforcing their authority as the all-knowing elite. The modern Library holds an institutional responsibility to involve the consumer in the process of information dissemination, providing greater opportunity for information creation and understanding.

 

References

  1. Dahlkild, N. (2011). The Emergence and Challenge of the Modern Library Building: Ideal Types, Model Libraries, and Guidelines, from the Enlightenment to the Experience Economy. Library Trends, 60(1), 11-42.
  2. Pawley, C. (2003, October). Information Literacy: A Contradictory Coupling. The Library Quarterly, 73(4), 422-452.
  3. Ibid.
  4. Pribesh, S., Gavigan, K., & Dickinson, G. (2011). The Access Gap: Poverty and Characteristics of School Library Media Centers. The Library Quarterly, 81(2), 143-160.
  5. Pawley, C. (2003, October). Information Literacy: A Contradictory Coupling. The Library Quarterly, 73(4), 422-452.
  6. Chadwick, R. (2012, December). Protecting Open Access to Taxpayer-Funded Research: The Rise and Defeat of the Research Works Act. The Serials Librarian, 63(3-4), 296-304.
  7. Schmidt, P. (2010, February 14). New Journals, Free Online, Let Scholars Speak Out. The Chronicle of Higher Education. http://www.chronicle.com/article/open-access-journals-break/64143
  8. Reichman, H. (2012, March). Opposition grows to Tucson book removals and ethnic studies ban. Newsletter on Intellectual Freedom, 61, 1-84.
  9. Rosenzweig, R. (1991). Politics and anti-politics in librarianship. Progressive Librarian, 3, 2–8. http://www.progressivelibrariansguild.org/PL_Jnl/pdf/PL3_summer1991.pdf

Librarianship for Social Justice

Personal note: in this blog post, I am trying to think my way through an issue on which I know I need to educate myself more. I am white, with a legacy that includes Southern slaveholders on my father's side and German Nazis on my mother's. My intention is not to center Black Lives Matter on white people or the predominantly white professional fields discussed here, nor to suggest that White Saviors can step in to fix things, nor to pass the buck of responsibility to black activists, but instead to develop some kind of context for using this library degree in a transformative way. I don't know if I've done this well, but I hope it's better than not addressing the question at all.

The Neutrality Illusion and How to Combat it

Robert Jensen makes an interesting point in his 2004 article "The Myth of the Neutral Professional" when he states that an intellectual in any society is not neutral. Intellectual professionals, such as librarians, serve a function: to solidify the position of the elite. They do this by validating what they choose as important for the masses. Jensen discusses how librarians take on the agenda of the elite through acquisitions and programming, but one thing he does not acknowledge is the tagging system, which also confirms that agenda. Librarians are the gatekeepers of information. Today patrons have access to sources outside the library for almost any information they like; nevertheless, the most valid sources of intellectual information are still housed in some form of library. Libraries get their funding from somewhere, which makes them an extension of the elite as well. A library may house many voices, but a higher structure chooses those voices. Accessibility has changed how patrons interact with information, and librarians can use this to create a more open library system, one that acknowledges its bias.
Intellectuals cannot ignore the interconnectedness of institutions in the United States. Institutional libraries do not stand alone in the web of power structures: a government unit of some kind funds them, and the rich and powerful elite, to some extent, control those government units. Libraries reach far beyond career academics and intellectual professionals, especially academic libraries. Today the average millennial has to go to college to be financially secure; therefore the impact of an academic library reaches more minds than ever before. So many people encountering diverse perspectives in the social sciences could alter how future Americans think. The question is, with so many sources of information accessible, how will the average American react?
The mere option to immerse oneself in new ideas does not mean a person will not simply narrow their field of view to focus on what matters to them. Whether to embrace knowing a little bit about everything, or to accept that knowing everything about one thing is impossible, seems to be the intellectual conundrum of the 21st century. It is in this paradox that the excuse of neutrality is most dangerous. The idea of neutrality allows those who wish to narrow their field of view to continue doing so without recognizing the bias they are acquiring. By not advocating for new voices, libraries can enable this behavior: "[…] to take no explicit position by claiming to be neutral is also a political choice, particularly when one is given the resources that make it easy to evaluate the consequences of that distribution of power and potentially affect its distribution" (Jensen, 2004). If you look at the structure of cataloging, there is one field where this distribution of power is transparent: tagging. In the tags field, the goal is to describe a book in keywords findable by the patron. In a sense the librarian has the freedom to tag an item however they like, but at the same time they are limited to an acceptable "neutrality," in which they must use terms society recognizes as associated with the object. Using conventional tags is good for someone seeking out that information, but it limits the chance that someone will stumble upon the material and be exposed to something new, or to a new viewpoint on the subject. If it became convention to tag materials with terms from fields that oppose them, or that offer a new view of them (a less direct form of tagging), that could be one solution on this small scale. The intellectual professional should acknowledge this tendency of people to turn inward, as well as their own biases.
Another solution would be to add a new field to the tagging system that identifies a source's lens before the patron interacts with the source.
For example, someone who has limited themselves to the issue of deforestation of the Amazon might limit their keyword search to "deforestation" and "Amazon," which will educate them on that specific topic. The materials that person finds could include animals placed on extinction lists because of deforestation, the active parties causing it, and what governments might be doing to stop it. On WorldCat there is a field where that person can limit further by 'Topic': they can look at their subject of interest through a sociological lens, an agricultural one, an anthropological one, and many more. This field is the closest thing the library field currently has to a remedy for its lack of neutrality. There are still few available sources under the 'medicine' topic as a lens on the deforestation of the Amazon (one, to be exact), but the patron can at least recognize that a different lens exists on the same subject.
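The "lens" field proposed above can be sketched in code. This is a hypothetical illustration only, not WorldCat's actual data model: the record fields, lens names, and search function are all invented for the sake of the example.

```python
from dataclasses import dataclass


@dataclass
class CatalogRecord:
    title: str
    tags: list   # conventional, findable keywords
    lens: str    # the disciplinary viewpoint the work takes


def search(records, keywords, lens=None):
    """Return records whose tags match every keyword, optionally limited to one lens."""
    keywords = [k.lower() for k in keywords]
    hits = [
        r for r in records
        if all(any(k in t.lower() for t in r.tags) for k in keywords)
    ]
    if lens is not None:
        hits = [r for r in hits if r.lens == lens]
    return hits


# Invented sample records for the deforestation example above.
records = [
    CatalogRecord("Logging and Land Use in Brazil",
                  ["deforestation", "Amazon", "agriculture"], "agricultural"),
    CatalogRecord("Healing Plants of the Rainforest",
                  ["deforestation", "Amazon", "ethnobotany"], "medicine"),
]

# A keyword-only search returns both works; adding the lens facet
# surfaces the single "medicine" viewpoint on the same subject.
all_hits = search(records, ["deforestation", "amazon"])
medicine_hits = search(records, ["deforestation", "amazon"], lens="medicine")
```

The point of the sketch is that the lens lives alongside the conventional tags rather than replacing them, so conventional findability is preserved while the record's viewpoint is made explicit to the patron.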
The concept of neutrality in a library setting is, at best, an excuse for legitimacy. It needs to be clear to patrons that there are necessarily biases involved when dealing with any body of information, whether in a physical library or an online catalog. As library professionals, there are steps we can take to identify our catalog's limits and create transparency with patrons. Informing the public that they are exposing themselves to a limited collection of viewpoints at any given time could make a person more open to new voices. It may help them realize that there will always be another way to view something, which is the true issue of the neutrality illusion: it creates an authority out of something that can only honestly claim to be a small collection of intellectual thought.

Jensen, R. (2004). The Myth of the Neutral Professional. Progressive Librarian, 28–34.

Information Deserts

Access to information is widely viewed as a core principle of democratic society. But what if there are populations who don’t know how to find what they need, or even know that it is available to them? This thought occurred to me as I read Chapter 1 of “The Wealth of Networks” by Yochai Benkler. Benkler, an optimist who believes deeply in the potential power of the internet as a force for good, argues that “From a more substantive and global perspective focused on human development, the freedom to use basic resources and capabilities allows improved participation in the production of information and information-dependent components of human development.” [1] While this is almost certainly true, Benkler’s reasoning relies on the assumption that potential users (and producers) of information know how to access and use it.

As we discussed this topic in class, I thought of the library in my neighborhood, the people who use it, and what they might use it for. The library, obviously, houses a wealth of information, and also provides practical services like help with becoming a citizen and registering to vote. But how do people learn how to access that information? How do people even know where their library is? What if they don’t have one in their neighborhood, or town? I believe that, in fact, there may be vast “information deserts” here in our own city, as well as around this country and the world, where most people are not able to access the resources that are, in theory, available to them.

The idea of an "information desert" is based on the "food desert" concept, defined by the USDA as "…parts of the country vapid [sic] of fresh fruit, vegetables, and other healthful whole foods, usually found in impoverished areas…largely due to a lack of grocery stores, farmers' markets, and healthy food providers." [2] An information desert, therefore, might refer both to geographic areas without libraries or internet access, and to groups of people – the elderly, possibly, or non-English speakers, or people without cell phones or home computers – lacking the ability to access available resources.

A specific example of the latter concept is discussed by Jeff Cohen in his 2013 article, “Living in a College Information Desert.” Cohen responds to a piece in the New York Times, “Better Colleges Failing to Lure Talented Poor,” which highlights a disturbing statistic: “Only 34 percent of high-achieving high school seniors in the bottom fourth of income distribution attended any one of the country’s 238 most selective colleges.” [3] Cohen argues that “This phenomenon is largely due to a lack of information and access to cultural capital (i.e., knowledge about college and the associated application and financial aid processes)” and that “there are entire neighborhoods and even regions where nobody knows about or has attended selective colleges or, more importantly, that there are meaningful differences between the colleges that one might attend with respect to support, learning environments and graduation rates.” [4]

The effects of this situation are far-reaching. As the Times article points out, the graduation rate for low-income students attending local colleges is only 50 percent, versus 89 percent at selective colleges. [5] This fact alone limits the future prospects of these students, without factoring in that graduates of selective colleges will likely have better job opportunities than those who graduate from local colleges. When high-achieving students don’t attend universities with high academic standards, they are denied opportunities for success – and the world is denied their potential contribution.

The Times article suggests that the onus is on universities to address this issue. [6] Cohen has a number of suggestions, including funding more college counselors and programs that bring graduates from selective colleges to high schools in low-income communities. [7] I think a combination of efforts could, in this case, have a significant effect. I also think there is a role for the government, especially in ensuring that all public high school students know how to apply for financial aid (which may open up more possibilities for them).

More broadly, information deserts affect a variety of populations (but especially those in low-income communities). How, for example, do the unemployed search for jobs? If one has a home computer with internet access, we might say that it's easy enough to use employment websites. But what if one doesn't have a computer or internet at home? They can certainly use the library. But what if their community doesn't have a library, or it's too far or difficult to reach? This limits their options to a very narrow scope. (And even if they do have internet access, we are assuming that they know which sites to use and how to use them; we assume that they know how to write a resume and cover letter, and so on. This is a different kind of information desert, perhaps: an information literacy desert.)

Benkler’s fantasy of the internet as a great equalizer has merit. But we still live in a time when not everyone can access the internet, and not all of those who can know how to use it to their advantage. This will surely change organically over time as our culture becomes more and more “plugged in.” But in the meantime, we must work to ensure that all populations have ways of accessing information that is critical to their lives. This may mean bringing computers into senior centers; providing free wifi in public spaces; advertising campaigns advising people as to where they can find information they need; and any number of other case-specific solutions. Awareness of the issue is the first step towards finding a remedy.

[1] Benkler, Y. (2006). “Introduction: a moment of opportunity and challenge” in The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, 1–18.

[2] http://americannutritionassociation.org/newsletter/usda-defines-food-deserts

[3] Leonhardt, Dave. “Better Colleges Failing to Lure Talented Poor.” The New York Times, March 16, 2013.

[4] https://www.fsg.org/blog/living-college-information-desert

[5] Leonhardt, Dave. “Better Colleges Failing to Lure Talented Poor.” The New York Times, March 16, 2013.

[6] Ibid.

[7] https://www.fsg.org/blog/living-college-information-desert

Can Our Stance Toward Facebook Be Critical Enough?

 

"Information, knowledge and culture are central to human freedom. How they are produced and exchanged in our society critically affects the way we see the state of the world." (1) In our electronic world, a huge amount of our information comes from the Internet, and the production of that information is a complex thing that requires a critical approach from the consumer. Benkler is very optimistic about the way we use the public Internet space to inform our reading of this information. This shared public space allows many different voices to be heard, and Benkler claims that because so much information is available, there emerges "a more critical and self-reflective culture" (Benkler, p. 15). Whether this optimism is justified is another matter.

Recent issues with Facebook over trending topics and the algorithms that monitor site content both support and challenge the idea of the user's critical view of information from Facebook. How important is Facebook's influence? The New York Times reports that Facebook has 1.71 billion members worldwide, and that half of American adults get their news from Facebook. (2) This is a new narrowing of news sources: certainly no single newspaper or single TV news show has ever been relied upon by half the adult population for news. With that many people relying on Facebook as their single source of information, everything Facebook does matters.

Kincheloe describes electronic media as providing us with a "secondhand culture, filtered and preformed in the marketplace and constantly communicated via cultural and mass media." (3) This is what Facebook is doing in a very concentrated way: feeding us our culture and information, "filtered" as it chooses, giving us a filtered monoculture.

In May 2016, The Wall Street Journal wrote about bias in the trending topics posted by Facebook, claiming that conservative news stories were downplayed while liberal news stories were chosen and emphasized. Facebook replied that these stories are chosen through algorithms, and so are neutral, and that guidelines are "in place to support consistency and neutrality." (4)

While Facebook claimed neutrality, it nonetheless responded to the accusations of bias by changing its in-house training program. In June 2016, Facebook added political bias to its standard training sessions, alongside racial, gender, and other biases. (5)

In September, Facebook removed a 1972 Vietnam War photograph depicting a naked child from its site. However, member response (thousands of people reposted the picture) was such that Facebook reinstated it. In both instances, Facebook responded to user concerns.

What we see here is a constant back and forth between members and Facebook in a fight over control of the content. Members are not passive recipients of filtered information; they do question what they are seeing and reading. This supports Benkler's idea that "individuals are less susceptible to manipulation by a legally defined class of others — the owners of communications infrastructure and media" (Benkler, p. 9), and that control is not gained once and for all, but that "hegemonic consent is always in flux" (Kincheloe, p. 93).

But what do these two examples of member pushback really mean? We have two instances of Facebook changing its content because of outside input. Are they exceptions, moments when members were critical and aware that they were being manipulated, while the norm is the opposite? Is it true that most of the time Facebook's readers are not critical? The bias in trending topics was reported by a former employee, a whistleblower, not by a user; only once it was exposed did members push back. Would even the most critical of readers have noticed a bias, or would members be more likely to think that just because something was on Facebook it was authoritative? To draw a parallel: "The information encapsulated in an article stands alone, authoritative by virtue only of its presence in the volume. Legitimacy is conferred by its place on the Library shelf." (6) To paraphrase, news on Facebook is legitimized by being there. How many users will be able to take the critical stance described by Kincheloe, in which "critical hermeneutics traces the ways the cultural dynamics (of popular media) position audiences politically in ways that not only shape their political beliefs, but formulate their identities" (p. 103)? How able are we to "trace" the way Facebook "positions" us? Do we have the knowledge to do so? It does not seem that the open common space of the internet, as described by Benkler, routinely gives us a "more critical and self-reflective culture." Perhaps we can only hope that whistleblowers (like Snowden) will continue to come along, as we users may not be able to maintain a critical stance educated and robust enough to unearth these manipulations.

 

(1) http://www.benkler.org/Benkler_Wealth_Of_Networks_Chapter_1.pdf, p. 1

(2) www.nytimes.com/…/facebook-vietnam-war-photo-nudity.html

(3) https://www.researchgate.net/publication/261773451_Rethinking_Critical_Theory_and_Qualitative_Research

(4) http://www.wsj.com/articles/five-things-to-know-about-facebooks-trending-controversy-1462915385

(5) http://www.ibtimes.com/facebook-introduces-political-bias-training-after-trending-topics-controversy-2385911

(6) http://www.jstor.org/stable/4309685, p. 435