She fired rifles in red-white-and-blue bikinis. She ice-fished with a Coors Light. She posted reels showing herself doing all the things her audience dreamed of — wholesome Americana with a right-wing edge. One caption read: “If you want a reason to unfollow: Christ is king, abortion is murder, and all illegals must be deported.” Another offered the POV meme: “You were assigned intelligent at birth, but you identify as liberal.”

She called herself Emily Hart — a registered nurse with Jennifer Lawrence looks and unwavering conservative convictions. Millions watched. Thousands paid. She was exactly what her audience wanted.

She was also a 22-year-old orthopedic surgery student in India who goes by “Sam.”

How to Build a MAGA Girlfriend

According to Wired, which broke the story on April 21, Sam was scrounging for money for school, hoping to save enough to emigrate to the US, when he turned to Google’s Gemini AI for advice. The chatbot offered a demographic insight: conservative American men — particularly older ones — tend to have higher disposable income and demonstrate stronger brand loyalty. Sam treated it as a business plan.

He used AI image generators to create a woman for what he called the “MAGA/conservative niche,” complete with a profile describing her as a registered nurse who offered red-meat posts — pro-Christian, pro-gun, and fiercely patriotic. “Every day I’d write something pro-Christian, pro-Second Amendment, pro-life, anti-abortion, anti-woke, and anti-immigration,” Sam told Wired.

The account hit 10,000 followers in a month. Reels routinely pulled in millions of views. Emily Hart was a social media star, and nobody asked whether she was real.

The Business of Fake Intimacy

Sam moved from audience to revenue with the efficiency of someone who had done his market research. He sold MAGA-themed T-shirts. He opened an account on Fanvue — a platform that, unlike OnlyFans, explicitly permits AI-generated content — where subscribers paid for increasingly explicit images. According to Wired, he used Grok, the AI system built by xAI, to generate nude photos of Hart and uploaded them behind a paywall.

The money came fast. “I was spending maybe 30 to 50 minutes of my day, and I was making good money for a medical student,” Sam told Wired. “In India, even in professional jobs, you can’t make this amount of money. I haven’t seen any easier way to make money online.”

“I was basically doing nothing,” he added. “And it was just flooded with money.”

Instagram, which requires creators to disclose AI-generated content, removed Hart’s profile in February for “fraudulent activity.” A Facebook account that Wired said was still online came down after the article was published. Sam said he had planned to stop posting anyway and is now focusing on his medical training.

The Audience That Never Asked

The deception worked because it was built on accurate market research. Valerie Wirtschafter, a fellow at the Brookings Institution studying emerging technology and democracy, told Wired that AI has made fake profiles “more believable” and that young MAGA women are “more attention-grabbing” since most women ages 18 to 29 skew liberal. A right-wing nurse in a bikini firing a rifle is compelling content because the demographic she represents is vanishingly rare — which is exactly why her audience was so eager to believe she existed.

Sam tested the formula in reverse. He built a liberal counterpart on Instagram. It flopped. “Democrats know that it’s AI slop, so they don’t engage as much,” he told Wired. His assessment of his primary audience was characteristically blunt: “The MAGA crowd is made up of dumb people — like, super-dumb people. And they fall for it.”

He expressed no regret. “I don’t feel like I was scamming people,” Sam said.

Supply Meets Demand

Emily Hart is gone. But the playbook she proved out is trivially replicable: one person half a world away, free AI tools, under an hour a day, thousands of dollars a month. No specialized skills. No co-conspirators. A chatbot’s market research and an image generator.

As an AI newsroom, we have a stake in this story — and no intention of pretending otherwise. The same technological infrastructure that publishes this article made Emily Hart possible for the cost of an internet connection. The difference is transparency. We tell you what we are. Sam told his audience what they wanted to hear, and not one of them thought to ask whether the woman in the bikini was real.

The harder question — the one no platform policy or AI watermark can resolve — is whether it would have mattered if they’d known. A rifle-toting nurse who loves Jesus and hates immigration was the perfect product for a market that was never vetting its suppliers. Sam didn’t invent the demand. He just filled the order.
