CIELAB Scale Reveals Bias in Skin Color Algorithms

Researchers at Sony have taken a closer look at the bias in image algorithms when it comes to skin color representation. They found that current skin color scales, such as the Fitzpatrick scale, measure skin only along a light-to-dark axis and so fail to capture the true diversity of human skin. Using the CIELAB scale, a more detailed and accurate international color standard, they discovered that these algorithms favor redder skin tones. As a result, individuals with more yellow hues, such as many people from East Asia, South Asia, Latin America, and the Middle East, are underrepresented in the images these algorithms produce.
The Sony researchers proposed a new way to represent skin color, using two coordinates instead of a single number: one capturing how light or dark the skin is, and one capturing its hue along the red-to-yellow axis. This allows for a more comprehensive description of skin diversity. They tested the new method on various datasets and AI systems and found significant skew in popular face-image datasets such as CelebAMask-HQ and FFHQ. These biases also carried over into facial recognition programs and smile detection tools.
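To make the two-coordinate idea concrete, the sketch below converts an sRGB color to CIELAB and reads off a lightness value and a hue angle (red hues sit at small angles, yellow hues at larger ones). This is a minimal illustration of CIELAB coordinates using standard conversion formulas with a D65 white point, not the Sony researchers' actual implementation; the function names are the author's own.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert sRGB components in [0, 1] to CIELAB (D65 reference white)."""
    def linearize(c):
        # Undo the sRGB gamma curve.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = (linearize(c) for c in (r, g, b))
    # Linear sRGB -> CIE XYZ (D65).
    x = 0.4124564 * r + 0.3575761 * g + 0.1804375 * b
    y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b
    z = 0.0193339 * r + 0.1191920 * g + 0.9503041 * b
    # Scale by the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        # CIELAB companding function.
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16          # perceptual lightness, 0 (black) to 100 (white)
    a = 500 * (fx - fy)        # green (-) to red (+)
    b_lab = 200 * (fy - fz)    # blue (-) to yellow (+)
    return L, a, b_lab

def skin_tone_coordinates(r, g, b):
    """Return (lightness L*, hue angle in degrees) for an sRGB color.

    Smaller hue angles correspond to redder tones, larger ones to yellower tones.
    """
    L, a, b_lab = srgb_to_lab(r, g, b)
    hue = math.degrees(math.atan2(b_lab, a)) % 360
    return L, hue
```

With coordinates like these, a dataset can be audited on two axes at once: a collection could look balanced on lightness alone while still clustering heavily toward red hues, which is exactly the kind of skew a one-dimensional scale hides.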
The study’s findings raise concerns about the disadvantages faced by individuals from populations with underrepresented skin tones. Addressing these biases can make algorithms more accurate and inclusive, but more work is needed to develop better measures and apply them across systems. Continued progress in this area is crucial to ensuring fairness in AI systems.
The issue of bias in skin color algorithms highlights the need for ongoing research and development. As Alice Xiang, global head of AI Ethics at Sony, points out, if products are only evaluated in a one-dimensional way, biases will go undetected and unmitigated. Different measures may be necessary, depending on the situation.
In conclusion, the Sony researchers have shed light on the bias that exists in image algorithms when it comes to skin color representation. By using the CIELAB scale, they have discovered significant biases in various AI systems that favor redder skin tones. This underrepresentation of individuals with more yellow hues raises concerns about fairness and equality. Efforts to develop additional and improved measures are essential to address these biases and create more accurate and inclusive AI systems.