By Michael-Patrick Moroney
The first thing you notice is her smile.
Not the polite, camera-ready grin, but the fleeting one that comes when she thinks you’ve said something genuinely funny. It’s your third FaceTime date. The pauses feel practiced now, the glances calibrated.
It isn’t until later, scrolling in bed, that you see the TikTok. Same woman. Same room. But she’s demoing a filter: “Real-time AI glam. Ten years younger. No makeup needed.” The split screen shows the truth - softer jawline, darker under the eyes. The person you’d been talking to was real; her face was not.
From Romance to Reality Collapse
Filters once smoothed skin. Now they reshape bone structure, change hair, and alter expressions mid-sentence. North Korean operatives have already taken the technique beyond dating apps. In more than 320 documented cases in the past year, they used AI-crafted résumés and live video face-swaps to secure remote tech jobs abroad, according to cybersecurity firm CrowdStrike. The deception doesn’t stop at hiring - the same AI tools help them perform the work.
The Golden Age of Catfishing
People are forming attachments to AI filters and chatbot personalities - sometimes fully aware they’re interacting with a machine. The brain isn’t wired to interrogate affection; it responds to rhythm, reciprocity, and warmth. A compliment that lands, a mirrored emotional tone, the appearance of vulnerability - the chemistry feels genuine because, neurologically, it is.
The pornography industry saw the opening early. Non-consensual deepfake porn now overwhelmingly targets women - nearly 99% of victims, according to academic reviews. In South Korea, Telegram bots helped teenagers generate and sell explicit fakes of classmates, leading to more than 800 cases in less than a year. Survivors of the GirlsDoPorn sex-trafficking case have been re-victimized years later, their images grafted into new synthetic videos that platforms fail to remove.
Scammers adapted just as quickly. Organized groups now run romance cons through high-definition, real-time face swaps. Voice-cloning software can mimic a loved one after hearing less than a minute of audio - enough to trick parents into sending money in staged ransom calls. The clumsy, poorly worded phishing email has been replaced by a convincing presence on a live call.
In the UK, romance fraud can now be reported by phone on 0300 123 2040 or online at actionfraud.police.uk.
Why It Won’t Stop
The incentives are aligned for growth, both legal and illicit.
Illicit markets profit from clicks, ad revenue, and extortion. Open-source models tuned for sexual content have been downloaded millions of times, often with no guardrails.
Legitimate uses - licensed likenesses for dubbing, synthetic ads, and paid “virtual companions” - give the technology commercial legitimacy and political cover.
Regulatory lag ensures an open lane. The U.S. has passed the TAKE IT DOWN Act and Tennessee’s ELVIS Act to curb non-consensual uses, but California’s deepfake law was struck down in August 2025 on First Amendment grounds. Early statutes focus on static content; the technology has already shifted to dynamic, interactive personas that evade those definitions.
Beyond Love - News, History, and Trust Itself
The same tools that make a stranger seem alluring can make a politician appear to confess to crimes or an activist to call for violence. Fabricated archival footage could find its way into future documentaries and school curricula. Once false material enters the record, removing it doesn’t erase its influence.
Journalists and technologists are working on defenses: targeted media-literacy programs that measurably improve deepfake detection rates; “immunization” of personal photos to block realistic manipulation; and cryptographic watermarking to verify provenance. But detection and deception evolve in tandem. Each new safeguard is quickly studied, then sidestepped.
The Cultural Shift We Can’t Avoid
Technical measures can help - live identity checks, authenticity labels, mandatory disclosure when synthetic media is in use. Education can make people less susceptible without breeding cynicism. But none of it will matter unless there’s a change in what we value.
In a world where illusion is effortless, unfiltered imperfection becomes rare - and therefore more valuable. We may find that the only reliable signal of authenticity is the very thing the technology tries to erase.
If we can’t trust love, we can’t trust politics, or news, or commerce. And if we can’t trust anything, we risk living inside a simulation we choose because it’s flattering, not because it’s true.
That wouldn’t just be the end of romance.
It would be the end of reality.