AI bias tests gloss over a crucial aspect of skin color, Sony research claims


While the AI industry has focused on making its algorithms less biased based on the lightness or darkness of people’s skin tones, new research from Sony is calling for red and yellow skin hues to also be taken into account. In a paper published last month, authors William Thong and Alice Xiang from Sony AI, as well as Przemyslaw Joniak from the University of Tokyo, put forward a more “multidimensional” measurement of skin color in the hope that it might lead to more diverse and representative AI systems.
Researchers have been drawing attention to skin color biases in AI systems for years, including in an important 2018 study from Joy Buolamwini and Timnit Gebru that found AI was more prone to inaccuracies when used on darker-skinned females. In response, companies have stepped up efforts to test how accurately their systems work with a diverse range of skin tones.
The problem, according to Sony’s research, is that the skin tone scales currently used for such testing focus primarily on the lightness or darkness of skin. “If products are just being evaluated in this very one-dimensional way, there’s plenty of biases that will go undetected and unmitigated,” Alice Xiang, Sony’s global head of AI Ethics, tells Wired. “Our hope is that the work that we’re doing here can help replace some of the existing skin tone scales that really just focus on light versus dark.” In a blog post, Sony’s researchers specifically note that current scales don’t take into account biases against “East Asians, South Asians, Hispanics, Middle Eastern individuals, and others who might not neatly fit along the light-to-dark spectrum.”
As an example of the impact this measurement can have, Sony’s research found that common image datasets overrepresent people with lighter, redder skin and underrepresent darker, yellower skin, which can make AI systems trained on them less accurate. Sony found that Twitter’s image-cropping algorithm and two image-generating algorithms favored redder skin, Wired notes, while other AI systems would mistakenly classify people with redder skin hues as “more smiley.”
Sony’s proposed solution is an automated approach based on the preexisting CIELAB color standard, which would also eschew the manual categorization used with scales like the Monk Skin Tone Scale.
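To illustrate the idea (this is a sketch of the general technique, not Sony’s actual implementation), CIELAB represents a color as perceptual lightness L* plus two chroma axes, a* (green–red) and b* (blue–yellow). From a* and b* one can derive a hue angle, which is exactly the red-versus-yellow dimension the researchers argue is being ignored. The conversion below uses only standard sRGB/CIELAB formulas; the function names are ours.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert sRGB components in [0, 1] to CIELAB (D65 white point)."""
    # Linearize sRGB (inverse gamma curve)
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # Linear RGB -> XYZ (standard sRGB matrix, D65 illuminant)
    x = 0.4124564 * r + 0.3575761 * g + 0.1804375 * b
    y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b
    z = 0.0193339 * r + 0.1191920 * g + 0.9503041 * b
    # Scale by the D65 reference white
    x, y, z = x / 0.95047, y / 1.00000, z / 1.08883
    # Nonlinear compression used by CIELAB
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x), f(y), f(z)
    L = 116 * fy - 16        # perceptual lightness L* (0 = black, 100 = white)
    a = 500 * (fx - fy)      # green–red axis a*
    b_lab = 200 * (fy - fz)  # blue–yellow axis b*
    return L, a, b_lab

def skin_descriptors(r, g, b):
    """Return (L*, hue angle in degrees): the two dimensions — light-vs-dark
    and red-vs-yellow — that a multidimensional skin-color measure would use."""
    L, a, b_lab = srgb_to_lab(r, g, b)
    hue = math.degrees(math.atan2(b_lab, a)) % 360  # ~0-40° redder, ~50-90° yellower
    return L, hue
```

Given two skin-tone pixels of similar lightness, a yellower one yields a larger hue angle than a redder one, so a dataset audit could report the distribution of hue angles alongside the usual distribution of lightness.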
Although Sony’s approach is more multifaceted, part of the point of the Monk Skin Tone Scale — which is named after creator Ellis Monk — is its simplicity. The system is intentionally limited to 10 skin tones to offer diversity without risking the inconsistencies associated with having more categories. “Usually, if you got past 10 or 12 points on these types of scales [and] ask the same person to repeatedly pick out the same tones, the more you increase that scale, the less people are able to do that,” Monk said in an interview last year. “Cognitively speaking, it just becomes really hard to accurately and reliably differentiate.”
Monk also pushed back against the idea that his scale doesn’t take undertones and hue into account. “Research was dedicated to deciding which undertones to prioritize along the scale and at which points,” he tells Wired.
Nevertheless, Wired reports that a couple of major AI players have welcomed Sony’s research, with both Google and Amazon noting that they’re reviewing the paper.