Disney Original Animations Vs. Live Action Remakes


Final Projects

Introduction

      For this final project I thought long and hard about the subject matter I wanted to depict. Earlier in the semester I had made a visualization about the difference between critic and audience scores for the most critically acclaimed movies released between 2007 and 2017. The topic was entirely selfish: I often found that I disliked the highest critic-rated movies immensely, and I wanted to show that critic scores meant very little to most people. Then I was sent a visualization (pictured below) from Reddit featuring the subtracted difference of IMDB critic scores between Disney animated movies and their live action remakes. Having been burned many times by this new fad of studios remaking their own movies with “live action actors” (those air quotes are pointedly looking at The Lion King 2019), I decided to follow in the footsteps of my previous assignment and the Reddit graphic and compare audience scores to critic scores for the animated and live action versions. To take it further, I decided I would also look at Box Office sales and calculate the percent difference between the two scores for each movie.

The Inspiration for my Project

      My goal in the end was to see whether this information might sway potential viewership of the live action remakes. My intention was to create two visualizations: the percent difference and the box office sales. I wanted to collect my own data, both to ensure that the box office sales were as up to date as possible and to build a dataset that would upload cleanly into the software I used.

Material

      The dataset used in this project was created in Google Sheets, with the information largely drawn from Rotten Tomatoes and IMDB. The audience and critic scores were taken from Rotten Tomatoes, and the gross Box Office sales were taken from IMDB. Where a movie lacked box office information, the figure was taken from Wikipedia, with the exception of Alice in Wonderland 1951, which had no box office number at all and was omitted from that visualization. For movies still earning box office sales, like Mulan 2020, which was streaming on Disney+ with an added fee, the figures were current as of December 3rd, 2020. (As of the time of posting on December 8th, 2020, that fee has been removed.)

      After creating the dataset, the document was converted to an .xml file and loaded into a software called Tableau. The graphs were created in Tableau using two colors and saved as .jpeg files. Those graphs were then opened in Adobe Photoshop, where they were edited so that the bars displayed stills from the movies they depicted. All movie still images were taken from Wikidata. The final graphs were saved as .jpegs and assembled into PDFs to be emailed to UX participants along with a survey created with Survey Monkey.

Process

     The process began with collecting the IMDB release dates of the original animations and the remakes. I ended up using the exact movies the original visualization used, after deciding my criteria would include only remakes that shared a title with their original. This meant each had to be a full length film with the same storyline, which excluded the Sleeping Beauty/Maleficent remake, The Sorcerer’s Apprentice/Fantasia remake, Alice Through the Looking Glass, and Winnie the Pooh/Christopher Robin. Each pair also had to have every data point I needed, which excluded any planned remakes like The Little Mermaid or Hercules. After collecting release dates and Box Office sales for both the animations and the remakes, I collected the audience and critic scores from Rotten Tomatoes.

      I then decided to do all calculations myself, rather than in Tableau, to avoid computer error or confusing the software. I created fields in my dataset for the difference between audience and critic scores for both the animations and the live action versions, the difference between the animation and live action audience scores, and the difference between the animation and live action critic scores. Finally, I calculated the percent difference between the audience and critic scores for each animated and live action movie. Taking Dumbo 1941 as an example: its two Rotten Tomatoes scores are 98 and 70, so their difference is 28. Their sum is 168, which divided by two (the number of values) gives an average of 84. Dividing the difference by that average, 28 divided by 84, and multiplying by 100 turns it into a percentage. This gave me a percent difference between the audience and critic scores for Dumbo 1941 of 33%.
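
      To make the arithmetic above concrete, here is a minimal sketch of that percent difference formula in Python. It is my own illustration of the calculation described in this section, not part of the project's actual workflow, which was done by hand in Google Sheets.

    def percent_difference(score_a, score_b):
        # Absolute difference between the two scores, divided by their mean, times 100.
        difference = abs(score_a - score_b)
        average = (score_a + score_b) / 2
        return difference / average * 100

    # Dumbo 1941: Rotten Tomatoes scores of 98 and 70 -> 28 / 84 * 100, roughly 33%
    print(round(percent_difference(98, 70)))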

My Dataset

      With all of my calculations done, I uploaded the Google sheet into Tableau and created what I called my base visualizations. I wanted them all to be bar graphs, as I thought those would be the easiest to read for the information I was presenting. With the base graphs made, four at that point, I saved each as a .jpeg image and uploaded them into Adobe Photoshop. In Photoshop I added movie stills from both the animated and live action version of each film, made each animation still black and white by lowering its saturation, and cropped the stills to fit inside the bars of the graph. This produced my final images: four bar graphs saved as .jpegs, one for box office sales, one for audience scores, one for critic scores, and a final one for the percent difference.
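
      The desaturation was done by hand in Photoshop, but the same step could be scripted. Below is a minimal sketch using the Pillow imaging library, with a hypothetical filename standing in for one of the animation stills; it is an illustration, not part of the workflow actually used.

    from PIL import Image, ImageEnhance

    # Open an animation still (hypothetical filename) and drop its saturation to zero,
    # mirroring the Photoshop step used to mark the animated originals.
    still = Image.open("dumbo_1941_still.jpg")
    black_and_white = ImageEnhance.Color(still).enhance(0.0)
    black_and_white.save("dumbo_1941_still_bw.jpg")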

       I decided early on that I wanted the visualization to be fun and to include a representation of each movie itself. Though it is usually best practice, according to O’Reilly’s Fundamentals of Data Visualization, not to use too many colors or to “use color for the sake of coloring,” I believed my idea to be an exception. Color was essentially used in only two ways: a black and white image for the animation and a colored image for the live action remake. I believed this made a clear distinction between the two categories without requiring the viewer to figure out what the colors meant. Nevertheless, I still provided a written key to distinguish the live action images from the animation images. Using movie stills to represent the two versions should also cast a wide net for viewers with color-vision deficiency. Since a live action still typically contains several different colors, in the environment, the clothing, and the skin tones, there is a greater chance a person could tell which portion of the visualization was the live action and which was the animation. The stills help further because the images from the live action movies, with the exception of a movie like The Lion King, are very obviously pictures of people rather than cartoons.

User Experience Research

      The first round of UX research was a think aloud with only a single participant. Originally, I had created a version of the Box Office Sales graph that replaced the movie stills with a colored image of the Disney castle logo and a black and white version of the same logo. I was unsure about this iteration because the black and white castle did not read as a clear image. I sent the visual to a participant for a think aloud and spoke with them over a Zoom call. The consensus was that it was too confusing: in the majority of the bars, both castle images came across as a muddled gradient with no clear lines, so the graph conveyed neither a clear image nor accurate information. This version of Box Office Sales was scrapped and redone to look like the others.

      With the finished visualizations, I created a survey using Survey Monkey. My original goal was to see if these visuals inspired people to watch the movies, so many of my survey questions were about the information depicted. Since the oldest movie, Dumbo, premiered in 1941 and the newest, the live action Mulan, premiered on streaming this year, I did not require that participants had seen the movies when they premiered or in theaters. I planned to include at least one participant in each age group: young adult, adult, and senior. There were seven questions and seven participants in total. The questions and results were as follows:

1. Have you seen any of these movies?

  1. All of them (14.29%)
  2. None of them (0%)
  3. Mostly animations (71.43%)
  4. Mostly remakes (0%)
  5. A Few (14.29%)

2. Did you learn any new information?

  1. Yes (85.71%)
  2. No (14.29%)

3. Did the information inspire you to watch any of the originals?

  1. Yes (28.57%)
  2. No (71.43%)

4. What about one of the remakes?

  1. Yes (42.86%)
  2. No (57.14%)

5. Were the graphics distracting or pleasing?

  1. Distracting (0%)
  2. Pleasing (71.43%)
  3. Unsure (28.57%)

6. Was it easy to understand?

  1. Yes (100%)
  2. No (0%)

7. What is your age?

  1. 18-25 (57.14%)
  2. 26-35 (14.29%)
  3. 36-45 (14.29%)
  4. 46-55 (0%)
  5. 56-64 (14.29%)
  6. 65+ (0%)
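
      Because there were seven participants, each percentage above corresponds to a whole number of respondents: 14.29% is one person, 28.57% is two, 71.43% is five, and so on. As a quick sketch, the counts for question one (inferred from the reported percentages) reproduce those figures:

    # Question 1 response counts, inferred from the reported percentages of 7 participants
    responses = {"All of them": 1, "None of them": 0, "Mostly animations": 5,
                 "Mostly remakes": 0, "A Few": 1}
    total = sum(responses.values())  # 7 participants
    for answer, count in responses.items():
        print(f"{answer}: {count / total * 100:.2f}%")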

      In addition to the survey, I asked four of the participants to take part in a think aloud exercise, similar to the one done with the first participant. I called them over the phone before they saw the visualizations and asked them to speak their thoughts aloud as they looked through them. The age range of these participants varied between 18 and 59. The comments were generally very positive and tended toward the subject matter of the visualizations: which movie was their favorite, or which they hadn’t seen yet. Two comments from the think alouds were about the visualizations themselves: “Just my personal view; like whenever I saw something in black and white, I would assume there is some negative tone about this thing” and “It took me a moment to understand the overlay bar charts, I think the side-by-side format of the last chart is the most clear.” In accordance with best practices, I did not ask the think aloud participants any questions until they had answered the survey after the activity. On occasion I answered specific, non-leading questions, such as “Does it matter I haven’t seen the Mulan remake?”, to which the answer was no. Overall, the responses were very positive.

Findings

Box Office Sales

      My first findings came from the data itself. From the first visualization I noted that The Lion King 2019 had the highest box office sales, which might have been due to the big names associated with the film, such as Beyoncé and Donald Glover. Remakes tended to have higher box office sales when they kept much of the original score, like Beauty and The Beast 2017, The Jungle Book 2016, Aladdin 2019, and The Lion King 2019. Remakes like Mulan 2020 and 101 Dalmatians 1996 both removed the musical aspect of their movies completely, and in the case of Mulan 2020 even removed entire comedic characters. Mulan 2020 has one of the lowest box office sales, which could be attributed to removing the music and some characters, to the fact that it did not open in theaters and instead streamed on Disney+ with an added fee, or to the political controversy surrounding the actress who played the title role. Around the time of its release, Hong Kong was in the midst of protests, and the actress tweeted in support of the Hong Kong police, which resulted in a call to boycott the film.

Critic Score

      For the Critic Score visualization, I was shocked to find that the animated Dumbo 1941 had a 98% critic score on Rotten Tomatoes. Overall, the original animations had higher critic ratings, with the sole exception of The Jungle Book 2016. Mulan 2020 had a surprisingly high critic score, while Aladdin 2019 had a low one. The critic scores did not seem to be affected by the removal of music or characters.

Audience Score

      The Audience Score visualization rated two of the live action remakes highly: Aladdin 2019 and The Jungle Book 2016. Overall, the audience scores for both the animations and the remakes ran lower than the critic scores. The highest rated film was the Aladdin 2019 live action remake, followed by the original The Lion King 1994. The Mulan 2020 remake sits right where I think it belongs, a bit below 50%, with its original animation scoring higher.

Percent Difference

     The last of the visualizations shows the percent difference between the audience and critic scores for both the animations and the live action remakes. I think it fair to say that a higher percent difference does not make a movie bad, though some UX participants read it that way. The highest percent differences belonged to some of the most recent live action remakes: Aladdin 2019, The Lion King 2019, and Mulan 2020. The Aladdin and Mulan remakes sit on opposite ends of the spectrum, with Aladdin having a low critic score and a high audience score, and Mulan just the opposite.

Graphs created by Survey Monkey

      My findings from the user experience research leaned toward the positive. Most of those surveyed had seen mostly the animations, and all but one learned new information. The same five participants who answered that they had seen mostly the animations also answered that they were not inspired to watch any of the originals. This was a fault of my own survey: I should have asked whether they had already seen all of the animations listed. It created a flaw in my data for that one question about the animations, but a flaw all the same. As for the remakes, the answers were almost evenly split between being inspired to watch them and not. The think aloud data might suggest this was because the final graph, the percent difference, was read by some as a negative mark against those movies. This split makes the answer to my question either inconclusive, or it tells me that it does not matter to most people if the critic and audience scores are completely different. Three people had an interest in the remakes, but four people did not, regardless of, or maybe because of, the information. Most people in the survey found the graphs pleasing and easy to understand.

      Using two methods of user experience research helped the project immensely. Had I not done a first round of think alouds when I doubted whether my design was clear enough, I would have had to redo everything at a much later stage. The second round of think alouds, with my later iterations, helped me understand the survey answers better as well as which type of graph functioned best. Without that research, I would never have heard a comment about the layout of the Percent Difference graph compared to the rest, and I wouldn’t have gotten that information from my survey alone. It also opened two-way conversations: participants could give me detailed feedback, and I could answer any questions they had about their participation.

Recommendations

      The original plan for this project proposed three visualizations. It ended with four different graphs, but after the UX research and writing this report, I believe I may have proved my point with two. In theory, a movie’s Box Office sales should be a clear indication of its popularity, and that graph could then be compared to the graph of Percent Difference Between Critic and Audience Score. Taking the Aladdin remake as an example: it is the movie with the highest percent difference and the third highest box office sales. In other words, this live action remake had the widest gap between critic and audience ratings, yet it was still extremely popular and did better in theaters than most of the other movies. All this to say, in the end I made four graphs, but maybe I would have done just as well with two.

      In addition, the UX finding that favored the Percent Difference graph may sway the layout choice for future iterations. The clarity that graph gained by removing the overlaid bars and adding a written label to the original and remake visuals made it the most popular. I had worried it used too many labels, but in the end, a version of all of the graphs using this layout might function better and be clearer.