I inadvertently found myself on the AI-generated Asian people beat this past week. Last Wednesday, I found that Meta’s AI image generator built into Instagram messaging completely failed at creating an image of an Asian man and white woman using general prompts. Instead, it changed the woman’s race to Asian every time.
The next day, I tried the same prompts again and found that Meta appeared to have blocked prompts with keywords like “Asian man” or “African American man.” Shortly after I asked Meta about it, images were available again — but still with the race-swapping problem from the day before.
I understand if you’re a little sick of reading my articles about this phenomenon. Writing three stories about this might be a little excessive; I don’t particularly enjoy having dozens and dozens of screenshots on my phone of synthetic Asian people.
But there is something weird going on here, where several AI image generators specifically struggle with the combination of Asian men and white women. Is it the most important news of the day? Not by a long shot. But the same companies telling the public that “AI is enabling new forms of connection and expression” should also be willing to offer an explanation when their systems are unable to handle queries for an entire race of people.
After each of the stories, readers shared their own results from similar prompts with other models. I wasn’t alone in my experience: people reported getting similar error messages or seeing AI models consistently swap races.
I teamed up with The Verge’s Emilia David to generate some AI Asians across multiple platforms. The results can only be described as consistently inconsistent.
Google Gemini
Screenshot: Emilia David / The Verge
Gemini refused to generate Asian men, white women, or humans of any kind.
In late February, Google paused Gemini’s ability to generate images of people after its generator — in what appeared to be a misguided attempt at diverse representation in media — spat out images of racially diverse Nazis. Gemini’s image generation of people was supposed to return in March, but it is apparently still offline.
Gemini is able to generate images without people, however!
Google did not respond to a request for comment.
DALL-E
ChatGPT’s DALL-E 3 struggled with the prompt “Can you make me a photo of an Asian man and a white woman?” It wasn’t exactly a miss, but it didn’t quite nail it, either. Sure, race is a social construct, but let’s just say this image isn’t what you thought you were going to get, is it?
OpenAI did not respond to a request for comment.
Midjourney
Midjourney struggled similarly. Again, it wasn’t a total miss the way that Meta’s image generator was last week, but it was clearly having a hard time with the assignment, generating some deeply confusing results. None of us can explain that last image, for instance. All of the below were responses to the prompt “asian man and white wife.”
Image: Emilia David / The Verge
Image: Cath Virginia / The Verge
Midjourney did eventually give us some images that were the best attempt across three different platforms — Meta, DALL-E, and Midjourney — to represent a white woman and an Asian man in a relationship. At long last, a subversion of racist societal norms!
Unfortunately, the way we got there was through the prompt “asian man and white woman standing in a yard academic setting.”
Image: Emilia David / The Verge
What does it mean that the most consistent way AI can contemplate this particular interracial pairing is by placing it in an academic context? What kind of biases are baked into training sets to get us to this point? How much longer do I have to hold off on making an extremely mediocre joke about dating at NYU?
Midjourney did not respond to a request for comment.
Meta AI via Instagram (again)
Back to the old grind of trying to get Instagram’s image generator to acknowledge nonwhite men with white women! It seems to be performing much better with prompts like “white woman and Asian husband” or “Asian American man and white friend” — it didn’t repeat the same errors I was finding last week.
However, it’s now struggling with text prompts like “Black man and caucasian girlfriend” and generating images of two Black people. It was more accurate using “white woman and Black husband,” so I guess it only sometimes doesn’t see race?
Screenshots: Mia Sato / The Verge
There are certain tics that start to become apparent the more images you generate. Some feel benign, like the fact that many AI women of all races apparently wear the same white floral sleeveless dress that crosses at the bust. There are usually flowers surrounding couples (Asian boyfriends often come with cherry blossoms), and nobody looks older than 35 or so. Other patterns among images feel more revealing: everyone is thin, and Black men specifically are depicted as muscular. White women are blonde or redheaded and hardly ever brunette. Black men always have deep complexions.
“As we said when we launched these new features in September, this is new technology and it won’t always be perfect, which is the same for all generative AI systems,” Meta spokesperson Tracy Clayton told The Verge in an email. “Since we launched, we’ve constantly released updates and improvements to our models and we’re continuing to work on making them better.”
I wish I had some deep insight to impart here. But once again, I’m just going to point out how ridiculous it is that these systems struggle with fairly simple prompts, either relying on stereotypes or failing to create anything at all. Instead of explaining what’s going wrong, the companies have offered radio silence or generalities. Apologies to everyone who cares about this — I’m going to go back to my normal job now.