There's a particular kind of internet hustle that's both completely predictable and still somehow shocking when you see it laid out in detail. A medical student, according to a report from Wired, claims to have made thousands of dollars selling photos and videos of a young conservative woman who does not exist - created entirely using generative AI tools.
He's not the only one doing this. And the people buying? He describes them, bluntly, as "super dumb" men.

The anatomy of a digital con
The formula is straightforward to the point of being depressing. Take the aesthetic of a young, politically conservative woman - the kind of persona that performs well in certain corners of social media - generate convincing photos and video content using AI tools, and then sell access to that content to men who believe they're connecting with a real person.
It works because the AI-generated imagery has become genuinely difficult to distinguish from real photography at a glance. And it works because the buyers, for whatever reason, aren't asking too many questions.

The political framing matters here. The MAGA aesthetic isn't incidental - it's a targeting strategy. It signals a specific set of values and a specific community, which makes the fake persona more believable and more appealing to a particular audience. The scammer isn't just selling photos. He's selling an identity that feels familiar and desirable to the people he's after.
Why this matters beyond the obvious "AI bad" take
It would be easy to file this under "yet another AI misuse story" and move on. But there are a few things worth sitting with here.

First, the ease of it. This isn't a sophisticated operation requiring technical expertise. Generative tools have become accessible enough that someone can spin up a convincing fake human persona and monetize it with relatively little friction.
Second, the scale. This is one person talking to Wired. The report makes clear he's not alone, which means the actual footprint of this kind of scam is almost certainly much larger than one med student's side hustle.
Third, the trust problem this creates for real creators. When fake personas become this easy to produce and sell, it adds a layer of suspicion to every online interaction. Real people with real audiences now have to work harder to prove they're actually human.
The bigger picture
We're entering a moment where the question "is this person real?" is going to come up more and more - and not just in obvious scam contexts. The tools are too good, the incentives are too clear, and the guardrails are still catching up.
In the meantime, the advice is annoyingly simple: slow down, look closely, and maybe reconsider any transaction that started with a too-perfect face on your screen.