UXPA@Pratt organised the ‘Emotionally Intelligent Design’ workshop on 16 February 2019. It was conducted by Pamela Pavliscak, and the theme of the workshop was ‘Love’.
The goal of the workshop was to give participants a basic understanding of how emotion-sensitive artificial intelligence works and how to design for it. The session was broken into parts like a four-course meal, and the participants were divided into pairs to mimic the setting of a date. Each pair was given a topic and a situation for which they had to design an emotionally intelligent device.
Each pair conducted an interview related to the situation provided to them, with one participant playing the interviewer and the other the interviewee. The situations were all framed around love, such as cohabitation or being single, and the devices designed by the end of the workshop were meant to solve the problems faced by the participants.
The workshop was well structured, and all of its parts were outlined at the beginning. The problems and solutions were personal and unique because they were grounded in the participants’ own experiences. The workshop demonstrated ways to uncover the emotions behind each design and prototyping step, and discussed methods for designing not only mobile or web-based applications but physical products as well, so that they can read and adapt to human emotions.
The workshop also explored how emotional intelligence shapes the future of technology, in which AI would be able to interact with humans on an emotional level. As Sengers (1999) describes it: “The hope is that rather than forcing humans to interface with machines, those machines may learn to interface with us, to present themselves in such a way that they do not drain us of our humanity, but instead themselves become humanized.”
There has always been a debate over whether AI is a benefit or a risk to society, but this workshop emphasized how AI and emotional design could be used to impact society in a positive way. The participants were encouraged to explore the world of ‘Emotional Intelligence’ much more deeply, which resulted in creative and adaptive designs by the end.
References:
Sengers, Phoebe. (1999). “Practices for a machine culture: a case study of integrating cultural theory and artificial intelligence.” Surfaces VIII.
On Saturday, February 16, 2019, about thirty Pratt students attended a three-hour workshop at the Pratt Manhattan Campus called “Emotionally Intelligent Design,” hosted by the school’s chapter of UXPA (User Experience Professionals Association). The event was led by Pamela Pavliscak: founder of Change Sciences, a design research studio focused on emotionally intelligent design; author of the fall 2018 book Emotionally Intelligent Design (O’Reilly); and current faculty member at Pratt Institute.
According to the invitation posted on the Pratt School of Information listserv, the objectives of the workshop were to teach students how emotionally sensitive AI tools work, as well as methods to prototype, test, and evolve experiences with emotional intelligence. During the workshop, Pamela shared a statistic that emotion-centered AI products and tools will be a $50 billion industry by 2025, integrated with most other industries. Anecdotally, we are already seeing major trends in this direction, for example in the realms of online dating, facial recognition technology, voice assistants, and chatbots. Yvonne Rogers’ “New Theoretical Approaches for Human-Computer Interaction” supports this claim by explaining that, due to the rapid pace of technological development, new opportunities are being created to augment, extend, and support user experiences, interactions, and communications. Designers and technologists therefore now have new methods and practices to conceptualize and evaluate a fuller spectrum of interactive products that support a broader range of goals (e.g., aesthetically pleasing, motivating, fun) and evoke an emotional response from users or participants.
The workshop had students complete several different activities in groups of two to three people, interspersed with short presentations about tools demonstrating technology imbued with elements of emotional intelligence. Examples of the technologies introduced included social robots like Pepper; healthcare support technologies like SimSensei, which use facial reading and other biomarkers to sense emotion; CrystalKnows, which uses social media and text (i.e., email) data to aid in better communication with coworkers, candidates, and others; Affectiva, which enables facial emotion analysis in context; and the Toyota Concept-i car, which “anticipates” users’ needs to create a better driving and riding experience.
We began with an icebreaker, asking some of the “questions to fall in love” in our small groups. Once acquainted, each group was assigned a specific context (i.e., conflict) and a challenge (i.e., building empathy) from which we would operate and ideate throughout the other activities. My partner and I completed an interview in which we discussed a specific conflict. The scenario my partner shared was that she and a friend were attempting to find an apartment together while the friend was based in New York and she was out of the city for the summer. It posed a challenge because only the friend was able to view the apartments in person; communicating about desired apartment features was difficult, as was being completely transparent about priorities. The situation became so tense and uncertain that they eventually decided not to find an apartment together.
This scenario framed our further explorations into sketching and visualizing what happened to the relationship over time and what sensory experiences were involved. By the end of the prototyping, my partner and I had sketched a mobile app with front-view and self-view cameras and embedded sentiment analysis software, so that a remote person could view a physical space while the person showing the space could get a sense of how the viewer felt about it. In our pitch to the rest of the groups, we said this type of app could help in a number of scenarios: roommate to roommate, realtor to potential tenants, venue managers to clients, and more. It could potentially save time, money, and hassle while offering communication tools and insights to help people make good decisions and become better communicators.
My main takeaway from these somewhat abstract activities was to keep the people and context centered in every part of the process, and to allow myself to be surprised while discovering solutions. With this conclusion, I am reminded of Don Norman’s “Being Analog” essay, in which he describes a false dilemma: we can continue trying to make people more like computers (precise, logical, and unemotional), or we can make computers more like humans (creative, resourceful, attentive, and able to change). In fact, humans and computers can elevate one another, ultimately helping humans evolve and deal with the ever-increasing complexity of life.
References
Norman, Don A. (1998). The Invisible Computer: Why Good Products Can Fail, the Personal Computer is So Complex, and Information Appliances are the Solution. MIT Press. Chapter 7: “Being Analog.” https://jnd.org/being_analog/.
Rogers, Yvonne. (2004) “New theoretical approaches for human-computer interaction.” Annual Review of Information Science and Technology 38: 87-143.
Sengers, Phoebe. (1999). “Practices for a machine culture: a case study of integrating cultural theory and artificial intelligence.” Surfaces VIII.
UserTesting is a platform used to gather rapid customer feedback across different interfaces, and it is most commonly known for remote moderated and unmoderated usability studies. I recently watched a UserTesting webinar titled “How Product Insights uses UserTesting,” which explained how their product insights team uses their own platform to scale research within teams as well as across the entire company.
The webinar was broken into three parts: data science and user experience (UX), access to customers, and enabling others. I was particularly interested in the first segment, the relationship between data science and UX, because it had several connections to topics that we’ve discussed in class. The speaker, Josh Kunz, who is a senior UX researcher at UserTesting, placed most of the emphasis on user experience. He explained that they attempt to connect data science and UX research in order to ask and answer impactful questions that ultimately inform human-centered design. The discussion touched on research methods, human-computer interaction, and human-centered design.
It was interesting to see the different research approaches the UserTesting team takes, in relation to the methods that we’ve discussed in class. The speaker did not make a clear distinction between qualitative and quantitative research approaches, but it was apparent through his explanations. He elaborated on the UX research process, which greatly resembled a qualitative approach, using interviews and focus groups. He also briefly discussed the data science approach, which was more quantitative, using statistical modeling, predictions, and algorithms to ask and answer questions. It seemed, however, that the data science team only analyzed large sets of data already in their database, closely resembling secondary research, versus collecting and then analyzing data as primary research.
During the webinar he walked through a scenario in which UX researchers wanted to see if their perception of how customers used UserTesting matched how customers actually use it. This curiosity came about when UX researchers observed that customers would make copies of tests. As discussed in McGrath’s article, “Methodology matters: Doing research in the behavioral and social sciences,” observation is a qualitative research approach (1995). Data scientists then found that about 80% of tests are copied, and another 80% of those copies are copied again, which covers a vast majority of their customers. The data scientists discovered this through a process of modeling and querying data, closely related to quantitative approaches. The UX researchers then performed both in-person interviews and focus groups with their users to understand why customers created these copies. Interviews and focus groups are another qualitative approach that we’ve both read about and discussed in class (McGrath, 1995). They ultimately found that customers create “chains of tests,” on which the data scientists ran even more statistical modeling, resulting in a visualization that showed how all the tests were related. Finally, the UX researchers performed another round of interviews, which acted as a final set of validations for the previous findings. This switching between UX research and data science closely resembles a mixed-methods sequential exploratory design, where one team is essentially collecting data and another team is analyzing or validating it (Creswell & Creswell, 2018).
Ultimately, this research helped UserTesting redesign their interface. This relates to another set of topics that we’ve touched upon in class: human-centered design and human-computer interaction. The purpose of this iterative process is to figure out how the user is actually using the product. As I was watching the webinar, I thought of Wilson’s article, “Human Information Behavior,” in that the focus is not on the system but rather the user (2000). I also feel that this process as a whole pulls in principles from human-computer interaction: the research primarily observes human behavior and analyzes it in relation to the interface in order to design appropriately. At the end of his anecdote, the speaker explained that these findings helped them design with multi-study projects in mind, since that is the majority of their audience. They also adopted Google’s HEART framework, an instrument I was unfamiliar with.
Google’s HEART framework does an excellent job marrying UX and data science in that it covers five metrics that both teams are able to measure: happiness, engagement, adoption, retention, and task success. Engagement, adoption, and retention are metrics that data scientists can measure, while UX researchers can measure happiness and task success.
I thoroughly enjoyed this webinar. It was really interesting to see how they use their own platform to perform research, and I had never thought of UX research and data science as highly complementary. It makes sense to think of this as a mixed-methods approach, in that the strengths of one team eliminate the weaknesses of the other. For example, data scientists found that a majority of their customers were creating copies of tests, but they could not figure out why. The UX team was able to take a more human-centric approach to understanding this behavior. Another distinction that could be made would be to say that data science seems product-centered while UX seems human-centered.
References:
Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative & mixed methods approaches. Thousand Oaks, CA: SAGE.
McGrath, J. E. (1995). Methodology matters: Doing research in the behavioral and social sciences. Readings in Human–Computer Interaction, 152-169. doi:10.1016/b978-0-08-051574-8.50019-4
Wilson, T. D. (2000). Human information behavior. Informing Science: The International Journal of an Emerging Transdiscipline, 3, 49-56. doi:10.28945/576
How does the archive become a space of engagement? What are the ethical obligations of the archive? How do we draw attention to otherwise invisible voices? How does raw data become material for surveillance? Who owns the past? These were the questions that guided “Archiving Colonialism” a panel discussion hosted by Barnard College’s Center for Research on Women, as part of the larger conference “The Politics and Ethics of the Archive.” According to keynote speaker Elizabeth Castelli, the theme was inspired by audio of earlier feminist conferences, and how the process of digitization led to larger questions of use and ownership. As the discussion progressed, it became clear that reaching a final answer to any of these questions cannot and should not be the goal. Rather, archives should be spaces where continuous discussion is encouraged and continuous access fostered.
The archive has long been a site of contention. Once perceived as a purely objective record of history, the archive has recently been reconsidered through a post-modernist lens: as a fluid space of ongoing debate and discussion, rather than a static site of fixed history and narrative. As Joan M. Schwartz and Terry Cook state in “Archives, Records, and Power: The Making of Modern Memory,” “…by treating records and archives as contested sites of power, we can bring new sensibilities to understanding records and archives as dynamic technologies of rule which actually create the histories and social realities they ostensibly describe” (Schwartz & Cook, 7).
Despite their differences in profession, this emphasis on the archive as a device for creating history was shared by all three panel speakers. Moderated by acclaimed writer Saidiya Hartman, the panel included La Vaughn Belle, a multi-medium visual artist; Justin Leroy, a professor and historian; and Cameron Rowland, a visual artist. Notably, the panel featured no archivists, which I found compelling. How would the discussion be shaped by people who have a more dynamic relationship with the archive and don’t interact with it on a daily basis? What direction could it take?
The panel began with Justin, who discussed the relationship of the Black slave to the archive, and the collective cultural assumption that history moves in one direction. Similar to feminist scholarship, the slave’s relationship with the archive is historically one based on absence and the assumption that the voice of the slave carries no significance. He gave the example of a letter that philosopher Georg Wilhelm Friedrich Hegel wrote stating that Africa “is no historical part of the world.” Moving forward from this flawed ideology, Justin explained, the popular notion has been that the recovery of history is necessary to achieve social justice. But, Justin questioned, what is the benefit of being “unfit” for history? What new narratives are uncovered from the vantage point of being outside history?
Approaching the question as a historian rather than an archivist, Justin described the narratives of free slaves as shaped by perpetual subjugation by history. In spite of the technical abolition of slavery, Blacks would continue to be beholden to the oppressive structures of capitalism that underpin American progress. Capitalism and American history run in parallel to one another, with racialized conceptions of monetary value remaining constant. If things exist beyond the simple binary of life and death, it contorts our idea of time as linear. But, as Justin concluded, if we allow other trajectories of history to permeate the cultural understanding, we might be able to “find the language for more aspirational freedom.”
Justin’s idea of taking a more aspirational approach to history, with an eye towards the future as well as the past, strongly echoed Roy Rosenzweig’s “Scarcity or Abundance? Preserving the Past in a Digital Era,” which urged historians to “shift at least some of their attention from the past to the present and future and reclaim the professional vision that was more prevalent a century ago” (Rosenzweig, 739). It is a disservice to narrow the vision of history into one linear path.
The next speaker, Cameron, shared this idea of the archive, and what it represents, as being intrinsically limited in Black narratives. His main example was the concept of reparations and how its discourse opposes historical constructions of time and monetary value. In his art, Cameron uses historical documents to oppose capitalism. He presented one of his most recent works, “Burden of Proof,” which uses maps of 8060 Maxie Road, a property repossessed by former slaves during Reconstruction. The property was purchased in 2018 by a non-profit, which implemented a restrictive covenant so that the land, now valued at $0, cannot be used again. How then, Cameron asked, can this force us to rethink the notion of reparations as value-based and tied to property? The lack of historical documents relating to this property shows us the value of a limited archive, Cameron argued. How can we look beyond history to rethink the role of capitalism in reparations?
Scarcity in the archive and the narrative freedom it allows for were the central interests of artist La Vaughn Belle, the next speaker. Primarily focused on the Danish colonization of the Virgin Islands, La Vaughn described the Virgin Islands’ archives as splintered, due to acquisition by the Danish government. Because of this archival scarcity, La Vaughn argued, the memory of the islands had to be reproduced in alternative ways, which she explores in her work. For example, Chaney are fragments of Crucian pottery that often wash up after storms. La Vaughn collected these fragments and used them to create “process paintings,” to fill in the gaps. The lack of completion in the archive allowed her to utilize her imagination, which presents a necessary challenge to colonialism. In order for the archive to be a tool of resistance and fluidity, some scarcity is essential, she argued.
During their discussion with one another, all speakers challenged the idea of the archive as a place of necessary abundance. Justin presented the idea of “reading practice,” a method he uses in teaching, which emphasizes not what is present or absent in research, but what you do with what you find. La Vaughn emphasized the overlap between history and visual arts, and the need to make metaphors in both fields. Cameron added that the idea of accumulation in history is a byproduct of capitalism that should be reconsidered. The archive, all agreed, should be a space where one can create their own metaphors for the past and future.
In the end, I appreciated that no archivists were included. I felt that by allowing for more creative perspectives, those with a vague understanding of archives could be exposed to a broader view of their purpose. As I left the panel though, I quite honestly felt like I had my work cut out for me. What authority do I have to fill in the blanks of history? As an archivist, do I have the right to incorporate creativity into my work? But as I considered it more, I thought of how archives can never truly be complete. We can never truly possess every artifact of history; why even try? As the speakers showed, archives must have an element of creativity to challenge dominant narratives. Perhaps the point of archives shouldn’t be to merely present history as it was, but to provide an idea of a better future.
By Sarah Goldfarb, Info 601, Professor Chris Alen Sula
Schwartz, Joan M. and Terry Cook. “Archives, Records, and Power: The Making of Modern Memory.” Archival Science 2 (2002): 1-19.
Rosenzweig, Roy. “Scarcity or Abundance? Preserving the Past in a Digital Era.” The American Historical Review 108, no. 3 (2003): 735-762.
I attended an Open Data Week event about data stewards. I had heard about Open Data Week through the Pratt School of Information Google group, saw an event in Manhattan at a decent time (6 p.m.) on my free day, and answered the Eventbrite RSVP. When I arrived on 21st Street in the Flatiron District, my iPhone’s mail app began failing, displaying the subject line of the RSVP but not the email itself, despite refreshing and full reception. Ironic for someone attending an information science event. However, about four other people were immediately chatty and introduced themselves when we realized we were all trying to get to the ninth floor of the wrong building. If not for them and their awareness of two locations associated with the event or its sponsors, I probably would have been lost.
The correct building was a stone’s throw down the block. I didn’t ask these fellow attendees many questions, but I did ask whether they worked at the same place, since they clearly knew each other. They answered yes and no, equivocally: probably yes at some point, no now, and yes, of course, to having similar interests in data. We all arrived at the correct floor of the correct building, I believe a WeWork space. My impression was that most attendees were some form of software engineer, some with nonprofits, and that they showed up partly out of self-interest in access to data for their projects and in what the so-called ‘open data’ landscape looks like and aspires toward.
There were many free sandwiches and beers, which gave the event a specialized feel. I had sandwiches and a La Croix. In the ten minutes or so of chatting before the panel started, I talked with a slightly older guy standing near me who said he had a software project he was in the process of handing off to a friend’s private-sector company. I asked a few elementary questions informed by my first two months at Pratt, such as whether it was a database, whether it was in the cloud, and whether it used SQL. He said it was NoSQL. I’ve found, even so far, that there’s a consistency of ideas and themes once you start discussing projects in the data community.
Additionally, a survey was handed out at the beginning, which from its wording was intended to be filled out at the panel’s conclusion, though no one called for them and I didn’t see a bin to return them to. I still have mine. It’s not specific to this Data Stewards event but to Open Data Week in general. Its most telling response option, I thought, was the most advanced answer to “What is your level of data expertise?”: “I am a data expert with no fears, who is happiest when given a messy dataset to wrangle.” In addition to the rapport of the group, this suggests to me that the event, and the week in general, is consciously aimed at advanced information engineers.
The overall slant of the panel and attendees, I gathered, was about prying data from the private sector, and those attending had projects that could use it. As the panel went on, however, many comments portrayed the private sector as an efficient beast that is ready to sell, and even compete with, its data. Everyone there wanted ‘data collaboratives’ (private, nonprofit, government) to become more systematic and sustainable. They wanted more ‘piloting’ and prototyping, and predicted a ‘reimagining of statistics in the 21st century.’ Still, there were striking differences between the three sectors discussed, several of which were openly acknowledged.
The private sector had the most need to reflect on its biases, as its interests could change, and such a company would typically also have a desire to ‘get its name out there.’ Sometimes it’s even tricky for a company to get involved in a data agreement if the profit is long-term rather than short-term. Cubiq, a three-year-old startup for consumer location intelligence, had a representative present named Brennan Lake, who spoke about its Data for Good program, which uses opt-in smartphone app data to supplement natural disaster response. He mentioned in particular a focus on giving data rights to natural disaster professionals who can use the data appropriately.
However, it was also acknowledged across the board that access to data can sometimes come before genuine solutions or use protocols. Rules and a contract repository were mentioned as desired. Estonia, by contrast, already has legislation for data sharing, and Denmark, from which a statistician was present, pulls its census results from administrative data, employing just two people. Nick Eng from LinkedIn also noted that using information they already have takes about two analysts, compared to an external project. Brennan from Cubiq spoke about ‘figuring out the ask’ as a difficult part. Privacy, as a topic requiring upfront attention and cost, was highlighted in particular by Nick from LinkedIn. In these upfront negotiations, Lake mentioned a ‘privacy by design paradigm,’ and Eng emphasized the cost of producing a sharing agreement that is ‘as hard as possible to abuse,’ but said that was also the only way they were willing to enter sharing agreements.
I can think of several connections to design, and to identity and concept politics, from our Foundations course readings. Talja and Hartel, in their look at user-centered research, favor a turn toward the audience or user in an effort to reflect more realistic demographics and situational contexts, rather than just investigating how researchers use a system and whether their ‘needs’ are met. This is similar to a turn toward individual researchers, or so-called stewards at private companies, reflecting on the information they formulate and seek, and on their culture. The event did feel like a tech culture to me, although the most straightforward panelist, I thought, was Adrienne Schmoeker from the Mayor’s Office of Data Analytics, a new office employing about eight people. The Mayor’s Office has the advantage of being an ‘enterprising organization,’ she said, always minding to serve the city’s 8.6 million people. Nonprofits, by contrast, are more like government than the private sector in this respect: they can be much less efficient in contract production and may be just trying to keep the lights on in their offices. A private company, rather, may have more of a sense of ‘giving back’ for using city services and, frequently, census data.
It seems that in an imaginable future more companies, and even individuals, may seek data; Schmoeker from the Mayor’s Office anticipated eventually having an open help desk for data, but right now the office addresses matters like STEM (science, technology, engineering, math) funding for schools, free lunches for kids, ambulance speeds, and tenant abuse. However, as she said earlier, “there’s no ideal dataset,” and a live stream without history doesn’t reveal much that is useful. Another panelist echoed that the less private data is, the less useful it tends to be. This seems to invoke a more conceptual turn in use evaluation, one that is not just “task oriented” (Talja & Hartel, 2007) but turns to users with what seems to me like situational awareness and occasional cynicism.
Similarly, I can relate information needs, or a burgeoning ‘outlook’ methodology, to design needs and to the idea of an axis that actually dishes out preference on multiple traits while representing only one, as Costanza-Chock describes in her piece on design justice. There are, it seems to me, mechanized intersectionalities, like looking more dryly at how people use a system or what biases are implicit in their needs (looking at private companies or individual researchers), versus conscious intersectionalities, on which Costanza-Chock mounts the identity of Black feminism, like looking at how users have conceptualized or contextualized their information needs. Some of this may include parsing hidden intersections.
To me it seems like there is an interest in both delineating information by designers, in the “supply chain” (Sayers, 2018), as it were, and in allowing researchers and groups to self-pool data and identity that is increasingly, one would hope, less intersected by an axis that addresses that need only in a shadow, as Costanza-Chock references even some particular community centers as sites of oppression and resistance.
Given the axes already in place, I agree that it depends on a turn from looking at systems to looking at biases in groups, and from that, on changes in design to deconstruct shadow interests. It was clear that even this Open Data Week event existed in a particular culture. I think we are at an excess of intersections, with everyone on the web, and there is a need, in myself at least, to locate earlier points in timelines and to parse interests that are disadvantageously melded. In my experience this has to do with looking and working before and after points of apparent significance. Data professionals are already looking for granularity of information, as Nick Eng from LinkedIn mentioned in preference to surveys. A move toward reflection and granularity in interpreting users (or researchers) seems most important to me, as there may be as much to deconstruct there as in a ‘system.’ A heightening of design theory may logically follow. One of the panelists also mentioned the MIT Media Lab, which encourages “anti-disciplinary research” and already tracks mobility data to gauge housing inequality in and around Boston. It was clear and refreshing, at any rate, that everyone attending seemed geared toward outside-the-box thinking, at least as I perceived it.
References
Talja, Sanna & Jenna Hartel. (2007). “Revisiting the user-centered turn in information science research: an intellectual history perspective,” Information Research 12(4).
Costanza-Chock, Sasha. (2018). “Design Justice: Towards an Intersectional Feminist Framework for Design Theory and Practice.” Proceedings of the Design Research Society 2018.
Sayers, Jentry (2018). “Before You Make a Thing: Some Tips for Approaching Technology and Society.”
Boo-Hooray is an organization dedicated to archiving ephemera, photography, and book arts from 20th- and 21st-century counterculture movements. They operate mainly from their small space on the third floor of 277 Grand Street in Lower Manhattan. On December 12th, I decided to pay the space a visit via a completely unannounced walk-in.
Despite the unexpectedness of my presence, the staff were easy-going, friendly, and accommodating. The space is divided into two areas: a more public-oriented one where they host events and exhibitions and (at least for the time being, due to lack of workspace) process archival materials, and an office where most of the staff can be found at their desks working on various projects. In keeping with the staff’s artistic backgrounds and interest in counter-culture, the space is organized in a very informal way. Nevertheless, their résumé is impressive. The organization has handled archives from the likes of Larry Clark, Ira Cohen, William S. Burroughs, and Ian Dury, and has put together the Paris May 1968 collection at Yale University’s Beinecke Rare Book and Manuscript Library and the Hip-Hop History Archive at Cornell University.
Upon entering, I was greeted enthusiastically by Johan Kugelberg, the organization’s founder and head curator. Johan is a colorful person with a fascinating background that I cannot do justice to here. He was going to speak at 8-Ball Community, a zine archive and library, later that evening. After I told him I was an archiving student who needed something to write about for my assignment, he introduced me to Beth Rudig, the Director of Archives at Boo-Hooray. Beth studied film at SUNY Purchase and has been with Boo-Hooray since 2015. She told me that while she does not currently have a library science degree, she plans on acquiring one in the near future. She is currently working on a project pertaining to the costumes of the renowned performance artists Joey Arias and Klaus Nomi.
When I brought up digitization, Beth told me that most of the work is outsourced and that very little digitization happens in the space itself due to lack of resources. The staff is very small, with only a few interns and volunteers. Aside from Beth and Johan, the remaining key curator is Daylon Orr, Director of Rare Books and Manuscripts, whom I unfortunately did not get a chance to speak to. He recently prepared Boo-Hooray’s first rare books catalogue. The organization’s next big project is the acquisition of an archive from a renowned filmmaker. When I asked who it was, they told me they would tell me, but then they would have to kill me.
Reflection
In his interview with Saturdays Magazine, Johan brings up a much-discussed aporia within the world of archives in the 21st century, one we have spoken about at length in class: the tension between the rise of digital technology and the archive’s purpose of preserving past artifacts and narratives. Archives are pushed to embrace digital technology as a means of preservation, and advocates for change usually point to the inherent impermanence of material objects. As Michèle Valerie Cloonan writes in her paper “W(h)ither Preservation?,”
“The paradox of preservation is that it is impossible to keep things the same forever. To conserve, preserve, or restore is to alter. Even if an object survives untouched, it will have changed just by virtue of aging or by a change in its surroundings.”
But many feel that the increasing digitization of everything un-anchors us from the material world, as well as other people. As Johan articulates,
“A lot of people talk about content and archives that are born-digital and, I mean, look at all of our smart phones lined up right now and look at how you have thousands of text messages and photographs. You know your personal narrative of the last four or five years is contained in this machine. Our relationship to our memories has changed because we feel everything that’s born-digital is so super ephemeral that it actually has less meaning. On one level that can obviously liberate people from feeling too anchored in their past, but the flip side is that it really boosts this sort of hyper-individual sense of the perpetual now.”
This tension is a microcosm of a larger epochal problem in our culture brought about by the advent of digital technology, which we can broadly understand as the tension between modernity and postmodernity. While modernity seeks to retain a universal order that allows for rational communication, postmodernity argues for embracing the inherent irrationality, ephemerality, and atomized individualism of the digital era. While I would not dare to propose any sort of solution to such a monumental problem, I would simply conclude by pointing out that this tension is not necessarily one that needs to be resolved and that, as Jean-François Lyotard (the French philosopher who wrote perhaps the most influential text on postmodernism) once argued, modernity and postmodernity always co-exist. With this in mind, perhaps we can imagine archives embracing digital technology while retaining their traditional mode of preserving artifacts.
Pratt Library has two branches, one in Brooklyn and one in Manhattan. The Manhattan branch is much smaller than the Brooklyn branch. With Wilson’s model of information behavior in mind, I was interested to see how a small school library operates and whether it makes full use of its limited space to serve its patrons and embrace the community.
Physical Space Setup
Even though the physical space is relatively small, the setup of the library makes it seem very spacious. There are nine bookshelves in total, yet there is plenty of room between each shelf. In one of the aisles, a step stool is provided for patrons to reach books on the higher shelves. Such design reflects the accessibility principle and provides convenience for patrons in search of information.
It also came to my notice that the librarians made a lot of display space for the books. For instance, an entire wall is devoted to journal and magazine display. The great majority of the journals relate to library and information science; fewer than five in the entire display are non-information-related, such as Harvard Business Review, Rolling Stone, and The New Yorker. Interestingly, they also display quite a few children’s-librarianship magazines, such as The Horn Book and Children & Libraries, while I saw only two or three museum- or archives-related journals. I think the library did a great job of representing a variety of topics that could relate to the students, though the proportions could be adjusted. In terms of information behavior, this large display creates more opportunities for patrons who are not actively searching to encounter new, helpful information.
Welcoming Environments
I also noticed that about half of the library space is designed for patrons to relax, work, and study. By the windows, there are two big tables that allow group study, with a coffee-and-tea cart placed next to them. On the side are four small sofas where students can study comfortably on their own, and green plants sit by the windows. During my three-hour stay, more than half of the students used the relaxing area to work on their own laptops, yet the seats were never full; there were always seats available. This shows that the relaxation space meets the needs and expectations of Pratt students, who use it as a learning space to acquire new information, regardless of whether that information comes from the physical library. There are also inspiring quotes about libraries and books on display. It led me to think about how libraries are integrated as an important part of the community: it is the environment a library provides that makes information more open and available to its patrons.
The Reference Room is the first thing people see when they come into the library. The words below “REFERENCE” read “Please Enter,” which is inviting and reduces the worry and shyness of entering a closed room for help. When I asked at the front desk about a book I was interested in, the librarian was very friendly and helpful. She told me that even if they didn’t have the book, they could request it for me from the Brooklyn branch. I appreciated that a lot, because fewer resources are indeed a disadvantage of a smaller library. The fact that the two branches can share resources and make them as available and convenient as possible to their patrons definitely helps.
Overall, it was interesting to see how a school library operates within its community and uses its space and service design to help and accommodate its patrons. Although the Manhattan branch already does a great job of serving its students, I hope it will become more active and involved in the community by holding small events, such as themed book weeks or short talks on popular information topics.
Theory used: Wilson, T.D. (2010). Fifty years of information behavior research. Bulletin of the American Society for Information Science & Technology, 36(3), pp. 27-34.
I visited with Stephanie Neel at the Mark Morris Dance Center on Friday, November 9th. Neel is overseeing a group of archivists working on a large-scale project at the Center in Fort Greene, Brooklyn. Her team has been making diligent progress toward digitizing the Center’s library of VHS and U-matic tapes.
History of the Mark Morris Dance Center
The Mark Morris Dance Center, located one block west of the Brooklyn Academy of Music at the intersection of Lafayette and Flatbush in Fort Greene, Brooklyn, has been the home base of the Mark Morris Dance Group since 2001. The Center was the first building dedicated solely to a dance group, and it serves an additional function as an education space and outreach facility for the community. It offers many affordable and inclusive classes, open to all regardless of experience or ability.
The Team
Neel is conducting this project in consultation with Greg Lisi and Savannah Campbell. Lisi and Campbell are video digitization specialists employed by the Dance Heritage Coalition. Lisi is also the moving image preservation specialist for the NYPL and has overseen all of their AV digitization efforts for the past ten years. Campbell is a graduate of the NYU Tisch School’s Moving Image Archiving and Preservation program. The team is rounded out by Regina Carra, Archive Project Metadata and Cataloging Coordinator, and Sarah Nguyen, a University of Washington MLIS student.
Funding
Neel and her team have been producing their work in accordance with a three-year Mellon grant specifically tailored to the Mark Morris Dance Center. The work complies with current digitization standards and is built around Omeka, an open-source web-publishing platform widely used in the cultural heritage field. The main objective is to organize and digitize the Center’s large holdings of U-matic, Betacam, VHS, and Hi8 tape.
Archival Process
Neel and her team begin by cross-referencing the individual records with open source software. This method is similar to that which is employed by the NYPL and the Tate in London.
The primary challenge of this work lies in coordinating between Mark Morris and the various institutions throughout the world that commission dance pieces from the company. Each of these institutions employs its own videographer and therefore maintains proprietary usage rights to its footage. This footage resides in a cold storage facility, and Mark Morris must request an extraction of the digital files from cold storage. The files are then checked for compliance with CollectiveAccess, an open-source database software used for cataloging.
Further Challenges
The archival process at the Mark Morris Dance Center poses exciting challenges. These challenges are best illustrated by Michelle Caswell’s article “‘The Archive’ Is Not an Archives: Acknowledging the Intellectual Contributions of Archival Studies.” In this article, Caswell identifies the importance of the record in archival practice. She writes: “The ‘record’ is the foundational concept in archival studies. Records, according to the prevailing definition in archival studies, are ‘persistent representations of activities, created by participants or observers of those activities or by their authorized proxies.’”1 Neel and her team of archivists and preservation specialists are sifting through various forms of records and must create separate hierarchies for them.
Neel and her team are grappling with the archiving and cataloging of the so-called “uncatalogable.” They approach this problem by dividing the work into two aspects. One aspect is the choreography, which is authored solely by Mark Morris. The choreography is its own text, which is then transmitted to other institutions that choose to perform the work with their own companies. The performances are a separate aspect of the process; they are made physical in the form of recordings captured by each company’s videography department.
This process of sorting relates to Caswell’s definition of provenance. She writes: “Through provenance, archival studies insists on the importance of the context of the record, even over and above its content.”2 While content is important for Neel, the contextualization of the performance (when, where, which company) is the primary method of placing the records within the archive.
Outside Assistance
Neel has contracted with The MediaPreserve in Pittsburgh to complement the work being done in Brooklyn. Shipping crates come and go from the Center’s archival office. The crates are filled with analog reels and cassettes, a couple of which I helped carry up to the lobby. According to the website of The MediaPreserve: “We have digitized for hundreds of institutions, universities, and museums transferring an array of formats including 1” Type C, 2” Quad, video cassettes, digital videos, film, and many more. Our work has covered numerous genres, including home movies, propaganda, documentaries, and works of art, as well as news, scientific, musical and educational programs.”
Practical Use of the Archive
The digital resources, once archived, are not simply kept in a closet. The tapes are a vital part of the company’s process and are heavily referenced by new dancers and other global dance companies in order to recreate the specifics of Morris’s choreography. A database exists for the dancers where they can access time-stamped footage of past performances and other forms of raw choreography that serve as the building blocks for new performances.
Secondary Goals
Neel’s team is also responsible for the large collection of costumes and ephemera belonging to the Mark Morris Dance Group, spanning the Group’s forty-year history. Additional items in need of archiving include historical prints, photographs, and programs. Most of these items are securely stored and are less urgent for the team; the analog video tapes are more fragile and require immediate attention. Neel has decided to tend to the costumes toward the back end of the grant.
Conclusion
Stephanie Neel and her team are dealing with an interesting challenge in archiving the digital materials at the Mark Morris Dance Center. They must parse through the records and create hierarchies of place and performance in order to assign order to their holdings. Their digitization and preservation methods are sophisticated and the team is composed of accomplished specialists in the field. The archive is unique in that these records will then become widely used as practical tools for instruction.
Sources:
Caswell, Michelle. “‘The Archive’ Is Not an Archives: Acknowledging the Intellectual Contributions of Archival Studies.”
On October 25th, I attended a workshop hosted by two senior UX designers from Athlon, one of the leading international design and technology companies in New York. During the workshop, they introduced their backgrounds, presented their projects, and showed us how to create a brand experience in the UX industry.
The speakers first asked us to define brand experience, and each of us wrote a definition down on a note. Before that, I had never heard of brand experience. My first thought was that it meant creating an ideology for the brand: using a methodology to help consumers understand the brand and its products. In response to my answer, the speakers brought up a recent design they had done.
In one of their projects, they created follow-up questions to help users understand the content and to gather information about its usability and learnability. Each time a user completes a task, a pop-up follow-up question asks for their opinion and understanding of the task and the product. Users can directly express their expectations and propose areas for improvement. After summarizing all the opinions and combining them with the company’s image, the designers are able to create a product that fits the overall user experience.
Eventually, they explained that brand experience is the use of technology, data, and storytelling to reshape lived experience. Brand experience theory offers guidance for brands to use new media platforms reasonably and effectively, especially online media, online games and virtual worlds, and mobile platforms designed for brand experience. In the UX field, designers present this through visual information: typography, a clean structural hierarchy, a balance of images and text blocks, spacing, and error prevention. The goal is to provide a positive user experience that helps users understand the brand.
Combined with brand-measurement methods and brand-indicator systems, brand experience provides concrete ways to measure and evaluate a brand, along with practical tools for brand auditing and improving brand management. It can be said that brand experience is a rising star in the fields of design and marketing.
Reflection
In my understanding, the ultimate goal of brand experience in the UX field is to immerse users in the brand’s atmosphere and enhance the brand’s appeal. As long as users keep interacting with the product, the experience not only reflects the humanization of the product but also increases users’ awareness of it and helps them form their own opinions of the brand.
The product represents the company’s brand and reputation. Even subtle errors in the product, such as grammar mistakes or broken links, will directly affect the user’s experience and damage the corporate image. Since technology, products, services, and marketing have never been so closely integrated, what kind of service the product provides, how that service shapes users’ feelings, and how users express themselves through the technology must all be transparent: users must be able to see the products, the services, and even the marketing. In conclusion, a smooth experience is very important. For example, you can’t force users to complete fifty extra steps just to buy something; today’s users won’t have the patience for it. Moreover, merely creating a brand experience will never be enough, since everyone can do that. The experience must also be inspiring, so users feel that you care about them and your product becomes truly competitive.