Yale professors confront racial bias in computer graphics
What stories need to be told? Yale professors address racial bias rooted in the algorithms that portray humans in computer graphics.
According to Yale computer science professor Theodore Kim, representation in animation is limited by the racially biased heritage of computer graphics technology.
Kim focuses on racial bias in the research behind computer-generated humans. From his previous work as a senior researcher at Pixar to his current work as a professor of computer science and co-director of the Yale Computer Graphics Group, Kim has considerable experience in this field. By raising awareness of racial bias in the history of computer graphics technology, Kim hopes to build a community dedicated to open discussion and confrontation of this systemic issue of representation.
“The idea that math and science provide an objective way to understand the world and ourselves is not a bulletproof guarantee,” Kim said, “but rather an ideal to be achieved.”
While interning at Rhythm and Hues Studios in 2001, Kim observed how subsurface scattering became synonymous with “skin”. Subsurface scattering creates a translucency intended to simulate the effect of light penetrating and diffusing beneath the skin. However, this luminous effect is a dominant visual characteristic only in young, white skin; such translucency is much less important for adding realism to darker skin tones. According to Kim, the field effectively singled out the physical feature most important to white skin, in an effort to mimic the skin types that dominated commercials, magazines and movies in the late 1990s. He discussed this common lighting technique in a talk at the SIGGRAPH 2021 conference.
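Subsurface scattering is often approximated with per-channel diffusion profiles that describe how much light re-emerges at a given distance from where it entered the skin. The sketch below is a minimal, hypothetical illustration of the idea (the lobe weights and variances are invented for demonstration, not measured skin data): the red channel’s profile is made widest, which is what produces the warm translucent glow most visible on light skin.

```python
import math

# Illustrative diffusion profiles for subsurface scattering, modeled as
# sums of Gaussians per color channel. The (weight, variance) pairs below
# are hypothetical values chosen for demonstration, not measured skin data.

def gaussian(r, v):
    """2D Gaussian lobe of variance v, evaluated at radius r (mm)."""
    return math.exp(-r * r / (2.0 * v)) / (2.0 * math.pi * v)

# Red light penetrates tissue deepest, so its profile is widest; this wide
# red falloff yields the soft glow associated with light skin tones.
PROFILES = {
    "red":   [(0.6, 1.00), (0.4, 0.20)],   # (weight, variance in mm^2)
    "green": [(0.7, 0.30), (0.3, 0.10)],
    "blue":  [(0.8, 0.15), (0.2, 0.05)],
}

def diffuse_reflectance(channel, r_mm):
    """Fraction of light re-emitted at distance r_mm from the entry point."""
    return sum(w * gaussian(r_mm, v) for w, v in PROFILES[channel])

if __name__ == "__main__":
    # Far from the entry point, mostly red light survives the diffusion.
    for ch in PROFILES:
        print(ch, diffuse_reflectance(ch, 2.0))
```

A renderer blurs incoming light across the surface with profiles like these; Kim’s point is that tuning such profiles, and the pipelines around them, to maximize this translucent glow optimizes for one class of skin in particular.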
For Kim, classifying this algorithm as a method of generating “human skin” was akin to calling pink bandages “flesh-colored”. He found that most journal articles on “skin rendering” included a single computer-generated white person when illustrating techniques intended to depict “humans”. Kim further raised concerns that such complex algorithms could realistically only be challenged by specialists of some standing in the community; what has been established as “skin” cannot easily be rewritten.
“We need to consider the multiple dimensions of diversity,” said computer science professor Holly Rushmeier, “and shouldn’t lump people together in groups with one group somehow holding a privileged position over the others.”
Rushmeier and fellow computer science professor Julie Dorsey have joined Kim in efforts to emphasize diversity in modeling humans. This team of computer graphics experts submitted a detailed abstract to SIGGRAPH 2021. Kim reflected on the ensuing controversy.
While five of the seven reviews were “overwhelmingly positive”, one was neutral and the other was “virulently negative”, which Kim described as containing coded racist messages. Ultimately, that final reviewer forced the abstract to be rejected.
“Technical subjects are usually taught as objective and as existing outside of history,” Kim said. “When you start talking about how these topics can also encode racist assumptions, people can get very upset, especially those who have benefited the most from this perception of objectivity. Science is often presented as a refuge from the prejudices of the world, and if you’re the messenger saying it’s not really safe, you can paint a target on yourself.”
Kim had no problem being that messenger. In an opinion piece published in Scientific American, he drew attention to the use of white people’s skin and hair as the “default” in the development of technology for computer-generated humans. Kim referenced John Alton’s 1949 “Painting with Light,” a film lighting guide designed specifically for white skin. These “human face” lighting techniques were then carried into the digital age, reused in lighting studies in the world of computer graphics.
He further noted that “technological white supremacy” extends to human hair. The “Marschner” model was the standard model used for rendering hair, but it was designed specifically to capture the interaction of light with smooth, straight fibers. This model was treated as a good-enough stand-in for depicting “human hair” in general, with no equivalent model developed for Afro-textured, kinky hair. Along the same lines, hair motion simulation algorithms have been built on the assumption that hair is made up of straight or wavy fibers, rather than kinky hair.
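The straight-fiber assumption Kim criticizes is baked into the model’s core geometry: each strand is treated as a smooth dielectric cylinder, and light scatters into three lobes (R, TT, TRT) that are Gaussians in the longitudinal angle, shifted by the cuticle-scale tilt. The sketch below illustrates only that longitudinal term, using one common parameterization of the lobe shifts and widths; the tilt and roughness values are illustrative, not taken from Kim’s work or any measured hair.

```python
import math

# Minimal sketch of a Marschner-style longitudinal scattering term.
# The model treats a hair strand as a smooth dielectric cylinder with
# tilted cuticle scales; tightly coiled, Afro-textured hair violates
# this smooth-cylinder assumption. Alpha/beta values are illustrative.

ALPHA = math.radians(-3.0)   # cuticle scale tilt (illustrative)
BETA = math.radians(5.0)     # longitudinal roughness (illustrative)

def gaussian(x, stddev):
    """Normalized 1D Gaussian evaluated at x."""
    return math.exp(-x * x / (2 * stddev * stddev)) / (stddev * math.sqrt(2 * math.pi))

def longitudinal(theta_h, lobe):
    """Longitudinal lobe M_p(theta_h) for the three cylinder paths:
    R (surface reflection), TT (transmit-transmit), TRT (internal bounce).
    Shifts/widths follow one common parameterization of the model."""
    shift = {"R": ALPHA, "TT": -ALPHA / 2, "TRT": -3 * ALPHA / 2}[lobe]
    width = {"R": BETA, "TT": BETA / 2, "TRT": 2 * BETA}[lobe]
    return gaussian(theta_h - shift, width)

if __name__ == "__main__":
    # The R lobe peaks at the tilt-shifted angle, not at the mirror angle.
    print(longitudinal(ALPHA, "R"), longitudinal(0.0, "R"))
```

Every quantity here presumes a single straight fiber with a well-defined axis; there is no term that accounts for the tight, irregular curl of kinky hair, which is precisely the gap Kim highlights.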
“We are currently exploring human hair simulation methods that encompass all human hair types,” Kim said, “not just those that adhere to a specific, historically biased standard of beauty.”
Rushmeier noted that research needs to be done to provide everyone with the tools they need to tell their stories. A.M. Darke, another contributor, developed the Open Source Afro Hair Library to counter the bias in computer graphics technology that limits the ability to represent Black characters. However, combating racial bias in computer graphics will require a significant community effort, Rushmeier pointed out.
Kim and Rushmeier led a session titled “Combating Racial Bias in Computer Graphics Requires Structural Change” at SIGGRAPH 2021 in an effort to convince others to join them in submitting extended abstracts for SIGGRAPH 2022. Rushmeier hopes for a future in which this is “not a niche topic” and the computer graphics community undergoes structural changes in terms of what is identified as important research and what count as appropriate ways to conduct it.
“Scientists are not automatically imbued with objectivity just because they participate in these disciplines,” Kim said. “Science always holds the promise of providing a safe haven from bias and prejudice, but that promise can only be fulfilled if we specifically decide to keep it.”
Dating back to his time at Pixar, Kim’s work has appeared in films such as “Cars 3,” “Coco,” “Incredibles 2” and “Toy Story 4.”