The Real-Time Deepfake Romance Scams Have Arrived

Watch how smooth-talking scammers known as “Yahoo Boys” use widely available face-swapping tech to carry out elaborate romance scams.
Photo-illustration of a man lit by a ring light, his face cut out and replaced by a woman’s face on a cell phone
Photo-illustration: Jacqui VanLiew; Getty Images

The compliments start flowing as soon as she answers the video call. “Wow, you so pretty, honey,” says the man on the other side of the screen. His video feed shows him as white, with short hair, likely a few years younger than her, sitting in front of his camera in a plaid shirt.

“You’re looking different with that beard and stuff gone,” the woman says in an American accent as the conversation gets going. The man doesn’t miss a beat. “I told you I was going to shave my beard so I will look good.”

Except, he isn’t who he claims to be. His video feed is a lie. And—beard or not—the face the woman can see over the video call is not his: It’s a deepfake.

In reality, the man is a scammer using face-swapping technology to totally change his appearance in real time. In a video of the call—filmed by the scammer’s accomplice, likely thousands of miles away from the woman—his real face can be seen on his laptop alongside the fake persona as he speaks to his victim.

This self-shot video is one of scores posted online by scammers known as Yahoo Boys, a loose collective of con artists often based in Nigeria. The video reveals how they are using deepfakes and face-swapping to ensnare victims in romance scams, building trust through fake identities before tricking targets into parting with thousands of dollars. More than $650 million was lost to romance fraud last year, the FBI says.

A Yahoo Boy scammer uses multiple phones and face-swapping software against a victim. It’s one of two techniques the group uses for real-time calls.

The Yahoo Boys have been experimenting with deepfake video clips for around two years and have shifted toward real-time deepfake video calls over the last year, says David Maimon, a professor at Georgia State University and the head of fraud insights at identity verification firm SentiLink. Maimon has monitored the Yahoo Boys on Telegram for more than four years and shared dozens of videos with WIRED revealing how the scammers are using deepfakes.

A WIRED review of the videos and three associated Yahoo Boy Telegram channels shows how the con artists’ techniques have evolved as deepfake applications and artificial intelligence have improved. It is one of the first times the specific tactics and outlandish techniques of scammers using deepfake video calls have been documented in this detail.

The videos show Yahoo Boys using the technology on setups involving both laptops and phones. In multiple videos, the scammers often brazenly show their own faces, as well as those of the victims they are scamming. “I don't think they're doing this because they’re stupid,” Maimon says. “I think that they simply don’t care, and they’re not afraid of the repercussions.”

The Yahoo Boys are experienced scammers—and they openly brag about it. Photos and videos of their conning and recruitment can be found all across social media, from Facebook to TikTok. However, the cybercriminals, who have links back to Nigerian prince email scams, are arguably at their most open on Telegram.

In groups containing thousands of members, Yahoo Boys organize and advertise their individual skills for a smorgasbord of scams. They’re skilled social manipulators, who can have long-lasting impacts on their victims. Business email compromise, crypto scams, and impersonation scams are all touted in hundreds of posts per day. Members claim to be selling photo and video editing skills and entire albums of explicit photographs that can be used to build a convincing persona. Fake IDs and legitimate-looking social media profiles are for sale. Scam “scripts” are free to download.

“The Yahoo Boys have elements of organized crime and disorganized crime,” says Paul Raffile, an intelligence analyst at the Network Contagion Research Institute, who has investigated Yahoo Boys sextorting teenagers and driving them towards suicide. “They don't have a leader, they don’t have a governance structure.” Rather, Raffile says, they organize in clusters and share advice and tips online. Telegram did not respond to WIRED’s request for comment about Yahoo Boys’ channels, but the three channels no longer appear to be accessible.

The digital con artists started using deepfakes as part of their romance scams around May 2022, says Maimon. “What folks were doing was just posting videos of themselves, changing their appearance, and then sending them to the victim—trying to lure them to talk to them,” he says. Since then, they’ve moved on.

To create their videos, the Yahoo Boys use a handful of different software tools and apps. WIRED is not naming the specific software, to limit people’s ability to copy the attacks. However, the tools they are using are often advertised for entertainment purposes, such as letting people swap their faces with those of celebrities or influencers.

The Yahoo Boys’ live deepfake calls run in two different ways. In the first, shown above, the scammers use a setup of two phones and a face-swapping app. The scammer holds the phone they are calling their victim with—they’re mostly seen using Zoom, Maimon says, but it can work on any platform—and uses its rear camera to record the screen of a second phone. This second phone has its camera pointed at the scammer’s face and is running a face-swapping app. They often place the two phones on stands to ensure they don’t move, and they use ring lights to improve the lighting for a real-time face swap, the videos show.

The second common tactic—shown below—uses a laptop instead of a phone. (WIRED has blurred real faces in both videos.) Here, the scammer uses a webcam to capture their face, and software running on the laptop changes their appearance. Videos of the setup show that scammers are able to see their own face alongside the altered deepfake, with only the manipulated image displayed over the live video call.

Maimon says he and his teams have tracked down some of the victims, both in deepfake videos and photo albums being sold by Yahoo Boys, and have tried to contact them. WIRED was not able to determine the real identities of victims or scammers, and it is not clear how many times deepfakes have been used. Still, the videos appearing to feature victims reveal how the scammers build rapport with their targets.

Videos show scammers telling people they “love” them and frequently complimenting their appearance. In one video, a scammer says they want to travel to Canada to meet their target, and when they do, they will pay them “instantly,” suggesting the target has sent them money that the scammer has falsely promised to pay back. Other videos contain deeply personal information, showing the levels of trust that scammers can create with the people they target. “Some victims, they talk about their eating disorders, they talk about their depression,” Maimon says.

The videos are likely only a snapshot of the Yahoo Boys’ romance-scamming activity, and many of the videos they self-publish are partly intended to show off their capabilities so that others will “buy” the approach. Raffile says that, as he investigated the groups for sextortion, he noticed most face-swapping was being used for romance scams. However, there were also instances where Yahoo Boys claimed to use deepfake “nude” generators on photographs.

The scammers can run face-swapping software on laptops. It mimics their facial expressions and mouth movements.

Artificial intelligence will supercharge scams. While deepfake videos have been around for more than half a decade—mostly used for nonconsensual pornography—they’re often glitchy and easy to spot. Lips don’t sync up as people talk, and errors are relatively trivial to detect. However, the technology is quickly improving and becoming easier for anyone to use.

Some of the Yahoo Boy videos are obvious, unconvincing fakes, while others appear plausible. When they’re viewed live, on a mobile phone, over an unstable connection, any obvious flaws may be masked—especially if a scammer has spent months social-engineering their victim.

Rachel Tobac, the cofounder and CEO of SocialProof Security, who along with company CTO Evan Tobac reviewed a selection of Yahoo Boy videos for WIRED, says she has seen face-swapping used in some romance scams over the past 12 months. “They were not super convincing last year, when I checked them out. This year, a lot has changed,” Rachel Tobac says. “Especially the ones where they’re able to change the pitch of their voice and the look of their face—sometimes changing skin tone, hair, eye color, everything’s matched. It’s pretty wild.”

The Yahoo Boys appear to often use existing personas and faces built into the apps. The scammers largely talk using their own voices, although in a couple of videos their voices may also have been altered. In many of the videos seen by WIRED, the onscreen characters can turn their heads but not move their entire bodies. Their lips and facial expressions move with those made by the face of the scammer. The capabilities of these tools will only improve over time.

Andrew Newell, the chief scientific officer at identity verification company iProov, says he has seen a “massive change” in the number of face-swapping systems being used in the last year, tracking more than 100 different tools. Only a fraction of these tools allow for real-time face-swapping, he says. “We have seen quite big advances in terms of how good these live tools are.”

While the Yahoo Boys may not develop their own software or be technically sophisticated, Maimon says, they are versatile. They’ll run multiple scams at once and speak with dozens of victims at a time, and different individuals bring the skills needed to complete an entire scam. “There’s a constant evolution,” Maimon says.

Ronnie Tokazowski, the chief fraud fighter at Intelligence for Good, which works with cybercrime victims, says that because the Yahoo Boys have used deepfakes for romance scams, they’ll pivot to using the technology for their other scams. “This is kind of an early warning where it’s like: ‘OK, they’re really good at doing these things. Now, what’s the next thing they’re going to do?’”