Published: Jan 27, 2026


Facebook led in deepfake-related fraud in 2025

What if you received a video call from a seemingly trusted friend, asking for urgent financial help? Would you question it? In the age of deepfakes, you must. Deepfake fraud is becoming increasingly sophisticated, and anyone can be a target.

A new Surfshark study identified how much money people lost to deepfake-related fraud in 2025 and what role social media platforms played in it.

Key insights

  • In 2025, deepfake-related losses from fraud and scams skyrocketed to $1.1 billion, tripling from $360 million in 2024 and marking a staggering, nearly ninefold increase over the $128 million total recorded from 2020 through 2023. A key driver of this surge was social media: 83% of all deepfake-related losses in 2025 originated on social media platforms, a sharp rise from just 33% in 2024.
  • By far, the three most targeted social media platforms for deepfake-related scams in 2025 were Facebook, WhatsApp, and Telegram. These three platforms accounted for 93% of deepfake scam losses originating on social media, with Facebook being the most common, resulting in $491 million in losses, followed by WhatsApp with $199 million, and Telegram with $167 million. Other social media platforms, such as TikTok, Instagram, and Threads, together accounted for nearly $36 million in losses, while an additional $31 million in losses occurred on social media platforms where the specific platform was not identified.
  • The predominant type of deepfake fraud in 2025 was the impersonation of famous individuals to promote fraudulent investment opportunities. This type of fraud accounted for 80% of total deepfake-related losses and 96% of losses on social media platforms, amounting to $886 million. Scammers used deepfake videos and audio to convincingly pose as celebrities, business leaders, or financial experts, persuading victims to trust and invest in bogus schemes.
  • In 2025, another notable deepfake scam type was romance fraud, where scammers used realistic videos and audio to build fake romantic relationships with victims, later requesting money for urgent health crises or convincing them to invest in fraudulent schemes. Women were targeted in 57% of cases and men in 43%, with romance scams accounting for an estimated $10 million in losses. Although these losses are smaller than those from investment scams, such highly targeted attacks often cause severe emotional and financial devastation for victims.
  • AI became integrated into nearly every type of fraud in 2025, making even phone calls from family members or video calls from romantic partners potentially fake, as scammers can easily impersonate voices and faces with deepfake technology. This threat is especially dangerous when meeting someone new online, where distinguishing real from fake is increasingly difficult. While platforms attempt to combat these scammers, their efforts often fall short, so users must remain highly cautious and treat online content with suspicion, particularly when money is involved.

Methodology and sources

This study used data from the AI Incident Database and Resemble.AI to create a combined dataset covering deepfake incidents from 2017 to September 2025. Incidents were included if they involved the generation of fake videos, images, or audio and were covered by media articles. For deepfake incidents related to fraud where a financial loss was clearly reported in the article, each case was further classified into one of 14 specific fraud subcategories. Additionally, we determined whether the scam originated on social media and, if so, identified the platform. In cases where an event mentioned multiple social media platforms, reported losses were divided equally among all mentioned platforms.
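As a minimal sketch of how that equal-split allocation could work in practice (the record structure, field names, and figures below are hypothetical illustrations, not Surfshark's actual pipeline):

```python
from collections import defaultdict

def allocate_losses(incidents):
    """Attribute each incident's reported loss to social media platforms,
    splitting it equally when multiple platforms are mentioned."""
    totals = defaultdict(float)
    for incident in incidents:
        # Fall back to an "unidentified" bucket when no platform is named.
        platforms = incident["platforms"] or ["unidentified"]
        share = incident["loss_usd"] / len(platforms)
        for platform in platforms:
            totals[platform] += share
    return dict(totals)

# Hypothetical incidents: one scam reported on two platforms, one on a single platform.
incidents = [
    {"loss_usd": 1_000_000, "platforms": ["Facebook", "WhatsApp"]},
    {"loss_usd": 500_000, "platforms": ["Telegram"]},
]
print(allocate_losses(incidents))
# {'Facebook': 500000.0, 'WhatsApp': 500000.0, 'Telegram': 500000.0}
```

Splitting equally is a simple default when an article does not say how a multi-platform scam's losses were distributed, though it can over- or under-credit individual platforms for any single incident.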

For the complete research material behind this study, visit here.

Data was collected from:

AI Incident Database (2025).
Resemble.AI (2025). Deepfake Incident Database.