‘I don’t feel flattered’: A Fashion Brand Steals a Model’s Looks to Generate an AI Commercial

Artificial intelligence being used to steal a person’s likeness for an unauthorized commercial should be the plot of a dystopian comic or TV series episode. But in 2026, it’s a common offense with negligible legal repercussions.
TikTok model vanellimelli030, or simply Mel, posted a video about a fashion company stealing her likeness to create a fully AI-generated video advertisement without her consent. Mel says a friend sent her the ad, which felt “very, very familiar.”
How could it not feel familiar when the model in the advertisement looks exactly like her? Mel walked through the details: the company styled the AI model’s hair to closely resemble her mullet, and even the AI model’s facial features mimicked her own.
Needless to say, Mel was speechless. Her caption reads, “They say imitation is the highest form of flattery, but I don’t feel flattered.” Her post continues, “This can’t be the future and should not be normalized.”
Looming concerns over copyright
Social media users were sympathetic to Mel and felt disgruntled at the fashion label’s blatant theft. Some TikTok users were curious about which fashion label did this, but the brand remains censored in Mel’s video.
Many agreed that Mel should sue the company that used her image to make an AI-generated advertisement. One user writes, “Sue them for using your likeness. That’s outrageous.” Others in the comment section joked about ‘copyrighting’ their faces.
Generative AI, at its current stage, is largely unregulated, which allows companies to use the technology to appropriate people’s likenesses against their will. It’s new technology, and nobody has all the right answers on how to approach it. At the core of the issue, AI should be built to serve people. Without firm, enforceable regulations, companies using generative AI to steal from artists will remain a recurring theme in the coming years.
Have a tip we should know? [email protected]