What makes us feel?
From a biological perspective, we know that nerves distributed throughout our bodies help us interpret the external stimuli that come in contact with them. The amygdala, a limbic structure in our brain, helps us process emotions and is part of what makes humans unique. The way our bodies have evolved has made us analog creatures that react well to external stimuli in the natural world, and this in turn has made us highly adaptable to earth's different environments (Norman, 1998). From a technological standpoint, what happens when we begin to build machines to be more like us? What happens when we want our machines to replicate our innate emotions, or our psyche, and perform for us?
These were questions I pondered while attending UXPA's Emotionally Intelligent Design Workshop on February 16th. During this workshop, Pamela Pavliscak, a specialist who studies the relationship between our emotions and technology, asked us to partner up and design an app or piece of technology with human emotion in mind. We were required to use two themes as the basis of our invention; my partner and I had to create a dating app for single people. To help us, Pamela offered examples of how the tech industry has already begun using signals of emotion, like our gestures and tone of voice, to build programs that react to us. Reading our emotions prompts the machine to respond in a way that's human, but not quite.
An example of this is SimSensei, a virtual human interviewer created to help health care professionals make more informed decisions about their patients based on their responses to the virtual interviewer. SimSensei is represented by a virtual human named Ellie, who is programmed to conduct interviews that help "…create interactional situations favorable to the automatic assessment of distress indicators, defined as verbal and nonverbal behaviors correlated with depression, anxiety, or post-traumatic stress disorder" (DeVault et al., 2014, p. 1061). Essentially, by creating a virtual helper like Ellie, people at risk of certain mental health disorders may feel they can open up to her, and in turn receive the right treatment. Patients are often misdiagnosed in the medical field, so I think SimSensei has the right programming to flag warning signs of a particular disorder (keep in mind that it is mainly being used in diagnosing mental health issues).
In my honest opinion, it almost feels like Ellie has been programmed to trick patients into thinking they can trust it. Over the course of an interview, the patient is being monitored, and every question Ellie asks is designed to elicit a response from the patient, either through speech or through facial changes. Here is a YouTube video that shows the sort of questions Ellie is programmed to ask during her interviews and the type of facial tracking the system uses.
Another great example offered to us was Toyota's 2017 short film presenting a futuristic vision of how some cars may be developed (access it here). The car featured in this film is a concept model, along with the AI named "You-ee" that is built into it. We see the car's AI offer advice, act as "wing-man", and, my personal favorite, give positive reinforcement. During the workshop, only the clip from 5:45 to 6:34 was shown; seen in its entirety, the film gives us a glimpse into what an emotionally intelligent system can do for us. Giving something like "You-ee" human-like qualities (like its ability to joke about Noah's messy hair) allows us to view the car as an extension of ourselves. More importantly, I think having a dependable AI will allow individuals to flourish and establish better ties with their human counterparts.
Learning about the different types of emotion-based systems already on the market reminded me of Phoebe Sengers's remarks on AI being "…autonomous agents, or independent artificial beings" (Sengers, 1999, p. 10). We can, at this point, say that Ellie is a step away from being an autonomous agent. Although SimSensei is currently only used to help doctors diagnose mental health patients, won't this tool eventually be programmed to perform the diagnosis by itself and then administer treatment as well?
After reading Sengers's article, I now understand how implementing emotion into our programs can push our machines to the next level. Ellie is programmed with a voice and made able to connect with humans so that we can better understand our own species. We will always be building toward the future, but we also want to keep our connections to one another close. After all, humans are empathetic, and this quality will be incorporated into the things we create. "You-ee" is a perfect example of how the relationship between human and AI can potentially be a harmonious union.
At the end of the workshop, all the groups presented their designs and prototypes. My partner and I created a dating app that required all users to scan a full-body image of themselves and display it on their profile page. Since I've never used a dating app, I was never subjected to their crueler realities; according to my workshop partner, dating apps can make finding a partner relatively uncomfortable and weird. By displaying your entire self, we believed our app could circumvent that feeling of discomfort and dishonesty and create a more open dating world. But you may ask at this point: "Where's the portion of your app's design that makes your prototype emotionally intelligent?"
And I will answer: “We’re not at that point yet”.
- DeVault, David, et al. (2014). SimSensei Kiosk: A Virtual Human Interviewer for Healthcare Decision Support. 13th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2014), 2, 1061–1068.
- Norman, Don A. (1998). The Invisible Computer: Why Good Products Can Fail, the Personal Computer is So Complex, and Information Appliances are the Solution. MIT Press. Chapter 7: Being Analog.
- Sengers, Phoebe. (1999). “Practices for a machine culture: a case study of integrating cultural theory and artificial intelligence.” Surfaces VIII.