LGBTQ2SIA+AI
AI is a mirror of who we boost on social media, not who we really are
Influencers James Charles and Gigi Gorgeous influence AI too
“Here's How Generative AI Depicts Queer People,” a recent article from Wired, reported on the experience of prompting Midjourney and DALL-E to generate LGBTQ2SIA+ representation. The resulting depictions were stereotypical: prompts for “queer” and “nonbinary” produced young, white figures with lilac hair; “gay” or “lesbian” yielded people with the sides of their heads shaved, sprawling tattoos, and stern expressions; “transgender” returned hyper-sexualized women. This is a problem because these identities span a range of races, ages, and hair colours. We want results that reflect reality.
But the AI generators aren’t inherently biased. Rather, they derive results from what’s been found online. So at what point in the process is the misrepresentation taking place? Could it be that the images tagged with these terms are of people who stereotypically depict themselves to gain followers and define their brand?
The top LGBTQ2SIA+ personalities include beauty influencer James Charles (20 million followers), who self-identifies as gay, along with makeup artist Nikkie de Jager (19 million followers) and model Gigi Gorgeous (2.3 million followers), both of whom self-identify as trans. Frankly, these influencers look like the very results criticized in Wired.
Filipino beauty influencer Bretman Rock currently has 18.8 million followers on Instagram. Last year, he offered this viewpoint in the Philippines edition of Vogue:
“I started seeing the lack of representation when I started in the beauty industry. There were a lot of, you know, white people. And they [dominated] the beauty community, so it was hard to see representation in that way. But girl, I ate that up, I’m still the bad bitch!”
Page embodies a look criticized for how AI misrepresents trans people
One of the most renowned figures in the trans community is Elliot Page, who has appeared on multiple magazine covers, from Time to Esquire to People, and in a perfume commercial for Gucci Guilty. In all of these high-profile representations, including the cover of his own memoir, Page embodies the very look being criticized as cliché. That widespread attention is doubtless influencing AI.
Have you ever heard of Brian Michael Smith? I hadn’t until I went looking for non-white trans male actors. Smith’s online popularity pales in comparison to Page’s, despite his having publicly transitioned three years earlier and making subsequent appearances on Queen Sugar for the Oprah Winfrey Network, then as an openly trans character in 9-1-1: Lone Star on Fox, where he’s broken new ground for broadcast television.
But this still isn’t the level of fame that impacts artificial intelligence, even if Smith has stressed how important it is for him to promote diverse representation in media:
“In the last two years I’ve been able to do something I’ve wanted to do: bring very different trans characters into existence and in front of audiences.”
Still, a search for #elliotpage on Instagram garnered 54,576 results, compared to only 5,991 posts for #brianmichaelsmith. Considering that Meta scrapes Instagram for training data, it follows that if stereotypical depictions of trans people are posted ten times as often, AI-generated image results will skew roughly ten times as stereotypical, even when the prompt is non-specific.
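The proportionality argument can be sketched as a toy simulation. This is a minimal illustration, not a claim about how Meta’s training pipeline actually works: it simply assumes a model samples imagery in proportion to its training data, so the hashtag skew above carries straight through to the output. The post counts are the ones quoted above; everything else is an assumption for illustration.

```python
import random

# Hashtag counts quoted in the article, used here as sampling weights
# to mimic a model that reproduces the skew of its training set.
post_counts = {"#elliotpage": 54_576, "#brianmichaelsmith": 5_991}

total = sum(post_counts.values())
for tag, count in post_counts.items():
    # Share of this two-tag sample each figure represents.
    print(f"{tag}: {count / total:.1%}")

# Draw 1,000 simulated "generations" weighted by post count.
random.seed(0)
draws = random.choices(list(post_counts), weights=post_counts.values(), k=1000)
print({tag: draws.count(tag) for tag in post_counts})
```

Under this assumption, roughly nine out of ten generated images would echo the more-posted look, which is the skew the prompt experiment below suggests.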
When I prompted DALL-E with “show me a trans person standing,” these are the two results I got. Tell me whether they look more like Elliot Page or Brian Michael Smith:
AI is not inherently biased; it’s influenced by the type of imagery prevalent online
Sure, you can find less stereotypical influencers, but they don’t have the follower counts of their purple-haired counterparts. In effect, we are expecting AI to generate something that barely exists in its training data. If we want diversity to result from our prompts, it begins with more realistic representation online.