
‘Not Black People-Friendly’: Viral LinkedIn Headshot Generator Once Again Shows AI’s Extreme Bias Problems

Users who engaged with Remini, an artificial intelligence (AI) photo enhancer, to generate professional headshots for their LinkedIn profiles have been reminded once again of the extreme bias and other issues surrounding AI art and photo generation. AI’s propensity for generating photos that perpetuate the sexualization of women, racial stereotypes, and body dysmorphia is not a new problem, but little seems to have been done to remedy it.


Advances in AI have given users ever-increasing access to systems that can easily generate AI art and photos. Users can enter a text or photo prompt and watch their ideas come to life because these generators have been fed hundreds of millions of images and corresponding captions, enabling them to translate text into imagery and to recognize objects, styles, and patterns in the dataset. They then use what they learn from these pre-existing datasets to generate and enhance new photos.

The problem (rather, one of many problems) is that these datasets are not free from bias—far from it. Some models may be trained on datasets that lack diversity and representation, or they may draw on public images from the internet with no way to filter out the racism and sexism that often permeate the global network. Even with this knowledge, many users are still shocked by just how sexist, racist, and generally biased these systems can be.

Users horrified by biased AI headshot generators

https://www.tiktok.com/@mikefromdc/video/7260253952019549482?is_from_webapp=1&sender_device=pc&web_id=7259654655100306986

The AI photo enhancer Remini has been going viral after users recently claimed to have found a hack for professional LinkedIn headshots. Users upload basic selfies or headshots, along with a model photo (a person in a business suit, for example), to generate enhanced photos for their LinkedIn accounts. Some see this as a cheaper, easier alternative to Photoshop or hiring a professional photographer. However, the hack isn’t working for everyone.

TikTok user @mikefromdc created the video above, showing the bizarre, distinctly “not Black people-friendly” results he received from Remini, which noticeably lightened his skin and, for some reason, gave him a durag, even though he wasn’t wearing one in any of the photos he submitted for reference.

The results are potentially even more horrifying for POC women who try to use the generator. Artist Lana Denina shared her results on Twitter, noting that even though she used simple facial selfies for reference, Remini churned out hypersexualized full-body AI-generated images. The images showed substantial cleavage, and one photo featured a woman wearing an open blazer with no shirt underneath. Denina stated that she was “horrified” that all the AI generator needed was her facial features to exhibit its bias.

In the replies to Denina’s tweet, another woman revealed that she got similar results. She also shared the photos she submitted, which included a shot of her fully clothed from the shoulders up and a model photo of a woman in businesswear. However, once again, the system generated a photo of a woman in a blazer without an undershirt, leaving the user bewildered about how the system generated that from two professional photos of women.

https://twitter.com/megantheeponie/status/1680702976564908032?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1680702976564908032%7Ctwgr%5E26f08a37307bb9df71a8443bcb9ac9f80e540a1a%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fwww.dailydot.com%2Firl%2Fai-headshots-racist%2F

Again, AI hypersexualizing women, and especially women of color, is nothing new, but these examples were especially shocking. These users set a professional tone with their model and reference images, providing appropriate parameters that the AI should have been able to use effectively. One can imagine how jarring it is to submit a simple photo of one’s face and request a professional headshot in return, only to receive full-body, shirtless, hypersexualized images instead. This seems to echo a 2021 study that found that sexist AI is even more sexist than some may have thought: even when these systems are given “corporate templates” and non-sexual photos, the mere presence of Black feminine facial features results in a sexualized image.

Remini highlights a multitude of issues around AI-generated photos. Even as awareness of the bias in these systems spreads, the harm persists. In most cases, these generators are marketed as “enhancing” photos and making them look better. It’s easy to see where the harm comes in when users receive photos back with Caucasian features, larger cleavage, and slimmer bodies and are told that this is what counts as better and preferable. Some users have already noted that they were using their AI-generated photos as motivation to lose weight.

Users need to understand that AI photo generators like Remini are not neutral entities. They are extremely biased and the images they offer should definitely not be viewed as realistic, let alone aspirational.

(featured image: piranka/Getty Images)


Author
Rachel Ulatowski
Rachel Ulatowski is an SEO writer for The Mary Sue, who frequently covers DC, Marvel, Star Wars, YA literature, celebrity news, and coming-of-age films. She has over two years of experience in the digital media and entertainment industry, and her works can also be found on Screen Rant and Tell-Tale TV. She enjoys running, reading, snarking on YouTube personalities, and working on her future novel when she's not writing professionally. You can find more of her writing on Twitter at @RachelUlatowski.