Do nearly all Indian men wear turbans? Generative AIs seem to think so, and it’s only the tip of the AI bias iceberg


While bias in generative AI is a well-known phenomenon, it’s still surprising what kinds of biases sometimes get unearthed. TechCrunch recently ran a test using Meta’s AI chatbot, which launched in April 2024 for over a dozen countries including India, and found an odd and disturbing trend.
When generating images using the prompt “Indian men,” the vast majority of the results feature men wearing turbans. Many Indian men do wear turbans (mainly practicing Sikhs), but according to the 2011 census, Sikhs make up only about 3.4% of the population of India’s capital, Delhi; in the AI-generated images, however, roughly three to four out of every five men wore one.
Unfortunately, this isn’t the first time generative AI has been caught up in a controversy related to race and other sensitive topics, and this is far from the worst example either.
How far does the rabbit hole go?
In August 2023, Google’s SGE and Bard AI (the latter now called Gemini) were caught with their pants down arguing the ‘benefits’ of genocide, slavery, fascism, and more. Bard also placed Hitler, Stalin, and Mussolini on a list of “greatest” leaders, with Hitler additionally making its list of “most effective leaders.”
Later that year, in December 2023, there were multiple incidents involving AI, the most awful of which was Stanford researchers finding CSAM (child sexual abuse material) in the popular LAION-5B image dataset that many AI image generators train on. That study found more than 3,000 known or suspected CSAM images in the dataset. Stable Diffusion maker Stability AI, which uses that set, claims that it filters out any harmful images. But how can that be verified? Those images could easily have been scraped alongside more benign search terms like ‘child’ or ‘children.’
There’s also the danger of AI being used in facial recognition, especially by law enforcement. Countless studies have already shown clear bias in which races and ethnicities are arrested at the highest rates, regardless of whether any wrongdoing has occurred. Combine that with the human bias baked into AI training data and you have technology that would result in even more false and unjust arrests. It’s gotten to the point that Microsoft doesn’t want its Azure AI being used by police forces.
It’s rather unsettling how quickly AI has taken over the tech landscape, and how many hurdles remain before it can finally be rid of these issues. But one could argue that these issues arose in the first place because AI trains on literally any dataset it can access without properly filtering the content. If we’re to address AI’s massive bias problem, we need to start properly vetting its datasets — not only for copyrighted sources but for actively harmful material that poisons the information well.