
Trends in Financial Sextortion: An investigation of sextortion reports in NCMEC CyberTipline data

In partnership with the National Center for Missing & Exploited Children (NCMEC), this research examines more than 15 million reports made to the CyberTipline from 2020 to 2023 to pinpoint cases of sextortion and examine the evolving scale and nature of financially motivated sextortion. Importantly, while sextortion can affect all ages, this report focuses explicitly on the sextortion of minors.

June 24, 2024

20 Minute Read

Key Findings
  1. Sextortion, and particularly financial sextortion, continues to be a major and ongoing threat, with an average of 812 reports of sextortion per week to NCMEC in the last year of data analyzed, and with reason to expect that the vast majority of those reports are financial sextortion.
  2. Perpetrators use tactics intentionally designed to stoke a victim’s fear about the life-changing impacts of their nudes being shared—often repeating claims that it will “ruin their life.”
  3. While we find that Instagram and Snapchat are the most common platforms used for sextortion, we also observe the emerging use of end-to-end encrypted messaging apps as secondary platforms to which victims are moved, as well as the prevalence of Cash App and gift cards as payment methods.
  4. Perpetrators operating from the two most common source countries, Nigeria and Cote d’Ivoire, make use of slightly different tactics and platforms.
  5. Reports submitted by Instagram constitute a clear majority of all reports of apparent sextortion submitted to NCMEC. However, there are reasons to suspect that other platforms are underreporting.

Introduction

Sextortion – threatening to expose sexual images of someone if they don’t yield to demands – has been a source of harm to youth for some time. In the last several years, concerns about a unique type of sextortion – financial sextortion – have been on the rise. Reports of financial sextortion revolve around demands for money and predominantly target boys and young men. In addition, financial sextortion marks the emergence of new organized offenders leveraging technology to target and extort minors at scale.


This page reviews several key findings from Thorn’s report on trends in financial sextortion and identifies opportunities for action to safeguard young people from this growing threat effectively. To read the complete research report and explore its findings in greater detail, please download the PDF.

The Rise of Financial Sextortion

Between 3.5% and 5% of people are believed to have experienced sextortion before reaching adulthood, with girls historically more likely than boys to be impacted. Previous surveys have found that demands were most often sexual or relational, including but not limited to demands for additional intimate imagery, engaging in sexual acts, or returning to or staying in a romantic relationship. Sextortion tied to financial demands was limited, reported in less than 10% of cases.

The overall trend in NCMEC reports shows a large wave of sextortion cases since the beginning of 2022; of the more than 32 million reports made to NCMEC that year, the hotline received 80,524 reports of online enticement (the reporting category inclusive of sextortion cases), reflecting an 82% increase over the year prior. Although the numbers do not, on their face, differentiate among types of sextortion, analysis of report details demonstrates this increase is driven mainly by reports involving financial sextortion.

NCMEC received an average of 812 sextortion reports per week.

More than two-thirds of these reports appear to be financially motivated.

Sextortion demands over time

Reports per week, split by demands of the perpetrator

Unlike historical sextortion reports, this surge of cases targets new groups, with 90% of victims detected in NCMEC reports being male, aged 14 to 17.

90% of victims detected in NCMEC reports were male, aged 14 to 17.

In addition, unlike historical cases where roughly half of the victims knew their abuser from their offline community, the offenders behind these cases appear to be concentrated internationally, with 47% of reports showing ties to Nigeria and Cote d’Ivoire. Additional countries, though in lower volumes, also appear in reports, including the United States, Philippines, United Kingdom, and India.

Given the data originates from NCMEC CyberTipline data (the US hotline to report online child sexual exploitation), it can be expected that some bias exists relating to the geographic distribution of cases appearing in this study.

Victim Impact

Explicit discussion of impacts on the victim was available in only 9% of the cases studied. Accounts among these cases show a range of experiences, including continued harassment after paying, image distribution, and mental health consequences, including depression, anxiety, and thoughts of self-harm.

Victim impacts

For the 8.9% of reports where any victim impacts were reported

Note: Categories are exclusive. Any instance where a report described a combination of multiple harm types is represented solely in the “multiple impacts” category.

Continued extortion

More than one in three (38%) reports with impact information mentioned making payments. However, payments often did not stop the harassment: 27% of victims who mentioned paying their perpetrator discussed ongoing demands after their first payment.

Other research into sextortion published by the Canadian Centre for Child Protection (a study involving both adult and minor experiences) found this outcome to be even more pronounced, with the large majority reporting that threats continued even after paying.

Image Dissemination

The distribution of images was a dominant thread in discussions of victim impact. Often, the language used during threats focused on the release of images to highly public forums to increase the risk of viral spread and exposure. 

However, in accounts of those who reported their images were, in fact, leaked, distribution channels tended to be more narrowly focused on their immediate friends and family. Discussion of platforms often mentioned in threats of dissemination and reported experiences of dissemination is included later in this research.


Self-harm or suicide

When mental and emotional impacts on victims were reported, we split such content into two categories: a more severe category of discussions of suicidal ideation and/or self-harm and a more general category of “other victim concerns” and mental stresses. 


Among reports that include details on victim impact, 1 in 8 discuss thoughts of suicide or self-harm.

Importantly, only a subset of cases include mention of a victim’s situation in report texts—it can be assumed that many victims do not explicitly provide such information about the mental or emotional impact of this experience in the report.

However, other sources of information, such as news articles regarding sextortion victims, can shed additional light on how these pressure tactics and the overall threat of image sharing can culminate in severe consequences such as suicide.

In one such case, a news article notes that perpetrators told the child that he “…would be labeled a pedophile. His parents wouldn’t love him. He wouldn’t be able to get into college or get a job. They would hurt or kill his parents.”

Tactics

Although, historically, sextortion has often involved more time developing a relationship with the victim and more subtle forms of coercion and manipulation (at least in the earlier phases of the abuse), the tactics seen in reports of financial sextortion to NCMEC often featured aggressive, rapid exchanges highlighting potential life-altering outcomes if victims fail to pay.

Catfishing

In the majority of cases, victims appeared to share a picture in response to images initially sent by the perpetrator, who often impersonated another young person (usually an attractive, similarly-aged child). This “catfishing” approach can lower the victim’s inhibitions to engage while helping to evade platform policies limiting interaction between adult and minor profiles.

Among the 31% of cases that indicate how imagery was obtained from the child, 70% of reports show signs of the victim reciprocating after the perpetrator shared imagery first.

Acquisition information

Rate of reports with information about how imagery was acquired

Note: A report was counted in multiple categories if multiple tactics were used.

However, threats were not always reliant on a victim sharing imagery. In 11% of reports with information about how images were acquired, victims report that they did not send sexual imagery of themselves but were threatened with images that were in some way fake or inauthentic. An additional 6% describe accounts being hacked or images stolen from another account.

While generative AI technologies may be involved in cases of fake or inauthentic imagery, these reports also include less technically advanced tactics. Other descriptions included using another person’s explicit imagery and threatening to say it was that of the victim, and examples of less photorealistic instances of deepfake imagery, including the child’s likeness in a sexual manner.

Threats

The chart below outlines the overall frequency of the various tactics perpetrators used to pressure victims, out of all reports where conversation data could be measured. The leading tactic cataloged in reports with chat logs was descriptions of life-altering impacts and ruin.

38% of reports with chat logs included exaggerated impacts and/or threats of ruining the victim’s life.

Beyond the threats to leak images, perpetrators describe outcomes such as the viral spread and distribution of the victim’s photos, resulting in the victim being kicked out of school, losing job opportunities, or facing criminal charges as a sex offender.

Pressure threats

Methods used to pressure victims out of chats with pressure tactics

Note: A report was counted in multiple categories if multiple tactics were used.

When these tactics show up in conversations, they are often formulaic; perpetrators use extremely similar or even identical threats against different victims as if operating off of a script designed to quickly and efficiently coerce victims to pay.

These tables showcase threats that perpetrators have repeated in this exact form four or more times over different reports:

Countdowns & Constant Contact

Perpetrators employ a range of methods to ensure that their victims must make quick decisions, attempt to pay quickly, and have no opportunity to seek help from their caregivers or other sources of support.

We coded two ways in which perpetrators imposed such urgency: firstly, the use of countdowns and deadlines to make the victim rush to pay, and secondly, the demand for constant communication and access.

With countdowns and deadlines, perpetrators would give children fixed periods to encourage payment or to extract a promise of a method and amount of payment.

For the second method in which perpetrators demanded constant communication from the child, perpetrators would threaten to expose a child’s imagery if children simply disconnected from the video chat or did not respond in text chat quickly enough. 

Platforms

Sextortion does not happen in a vacuum; how children (and perpetrators) interact with platforms and specific design features can facilitate these sextortion events. We can see this by examining which platforms were used to initially contact children and which were used as a secondary location to which perpetrators would move the conversation. 

To avoid bias introduced by platform reporting behaviors, we analyzed how platforms were used in sextortion events via explicit platform mentions in the report text. Any platform mentions were coded to capture how the platform was being discussed.

Initial contact

Of the 3,276 reports that explicitly discussed how platforms were used, 576 had an “initial contact” label; a few core platforms dominate the studied reports as places used for that initial point of contact, as shown in the chart below.

Platforms used for initial contact

Platforms mentioned more than ten times as an initial meeting platform

Note: A report was counted in multiple categories if multiple platforms were discussed.

Instagram and Snapchat were the most common platforms named as initial contact points in the studied reports, followed by Facebook, Omegle, Wizz, and Wink.

1 in 10 reports with information about the platform of first meeting mentioned Omegle or Wizz as an initial contact point.

Omegle and Wizz together were mentioned in roughly 10% of reports with initial contact mentions, highlighting the role that platforms designed to randomly connect strangers can play in enabling perpetrators to make new connections with children (while Omegle shut down in the second half of 2023, many similar competitors exist).

Secondary contact

Thorn surveys have found that 65% of children had experienced someone attempting to get them to “move from a public chat into a private conversation on a different platform.”

This is a common event in sextortion situations, with perpetrators moving children to secondary platforms, possibly because a platform may be less likely to detect the abusive behavior and/or where the child may be more likely to share content.

When we look at platforms that were coded as being used as a “secondary location,” where a report identified that the victim was moved from one platform to another (869 reports mention such a secondary location), we found that the most common platforms to which these interactions were moved are Snapchat and GChat, as shown in the chart below. Other prominent platforms mentioned included WhatsApp, Telegram, Instagram, iMessage, Facebook, and Discord.

Platforms used as secondary destinations

Platforms mentioned as a destination where conversation is moved to

Note: Platforms included if mentioned 15 or more times in this role. A report was counted in multiple categories if multiple platforms were discussed. Google encompasses Google messaging services but not YouTube.

Private messaging environments are popular among young people sending intimate images. A recent survey of youth found that 41% of teens who shared nudes did so via “Texting or messaging apps like WhatsApp”, 39% did so via “DM (Direct Message) in apps where the content disappears, like Snapchat”, and 30% did so via “DM (Direct Message) in the messaging feature of a social media app like Instagram or Twitter.” The next most common channel was via FaceTime or other video call/chat.

Image Dissemination

Threats often focused on the victim’s imagery being leaked in a highly public fashion, fueling fears of lasting consequences and embarrassment. A wide list of platforms were named in threats of distribution, including some with broad public reach. However, the platforms named explicitly as the location of distribution tended towards more direct network distribution rather than a general public audience.

While any experience of distribution increases the harm to the victim, less public points of initial distribution offer hope of limiting the extent of exposure of the material and increase the potential to stifle wide dissemination with resources like NCMEC’s Take It Down.

The chart below shows a comparison of threatened versus actual reported sites of image leaks. When looking at reports where victims stated that their images were actually leaked, the most common distribution platform mentioned was Instagram. Additional platforms named in reports of image dissemination, but at lower volumes, included YouTube, Facebook, Snapchat, Twitter, and TikTok.

Threatened dissemination platform

Platforms discussed as place where imagery would be disseminated

Note: Shows platforms mentioned 30 or more times for distribution. Of 1,837 reports with one or more platforms of threatened distribution, 102 confirmed distribution.

Payment

Chats frequently involve mention of payment methods or platforms. We measured named payment platforms and more general payment approaches such as gift cards or cryptocurrencies. 

The most common payment methods were Cash App and gift cards, followed by other easy-to-use payment apps such as PayPal and Venmo. The dominance of gift cards and Cash App has slightly increased over time relative to other payment services.

Platforms used for payment

Payment platforms mentioned in reports

Note: A report was counted in multiple categories if multiple payment methods were discussed.

Reporting Landscape

Reporting activity relating to sextortion varies widely across Electronic Service Providers (ESPs), both in reporting frequency and report contents. ESP reports are currently the overwhelmingly leading signal to NCMEC that a child is being sextorted, making up 85% of the total reports of sextortion during the sampled timeframe.

Dominant reporters

Three platforms stand out as the driving force behind these numbers: reports from Instagram constitute a large percentage of all ESP reports coming in where sextortion is reported, followed by Facebook and Snapchat.

There are a number of notable changes in the trends over time for reporting platforms, shown in the chart below (these trends are purely about the reporting platform itself and thus are not necessarily proportional to where the sextortion took place).

ESP reporting trends

Number of reports per week submitted by an ESP or via NCMEC’s public form.

Note: Platforms shown submitted 5 or more sextortion reports per week. The apparent drop in “Direct to NCMEC” reports between 2023-05 and 2023-08 is due to missing data (data not available at time of analysis) rather than due to a drop in submitted reports.

The first is a sharp increase in cases submitted by ESPs starting in the middle of 2022; that increase could reflect an actual increase in sextortion at that time but might also reflect work that NCMEC did in raising alarms about financial sextortion to the platforms in June of 2022.

The data show a dip in reports in the first half of 2023, followed by a return to this high rate by the last period studied, August 2023. This is reflected in increases in both Instagram and Snapchat data in August of 2023.

As Instagram reports constitute the majority of all reported sextortion, changes in Instagram reports can dwarf all other trends and effectively define the overall trend in sextortion reporting.

Reporting timeframes

The time it takes for an ESP to become aware of a sextortion event (either through proactive solutions or user reporting) and subsequently assess and report the event to NCMEC impacts how closely the overall volume of reports to NCMEC reflects the volume of sextortion events over time. 

Put another way, the data suggests the dip in reports by Instagram and Facebook in May 2023 is less a reflection of a drop in sextortion activity on these platforms and more likely a reflection of changes in the content moderation pipeline submitting these reports to NCMEC.

Several things may influence the time between the event and a report to NCMEC by an ESP, not all of which are in control of the ESP submitting the report. Among them are the existence and efficacy of proactive detection practices, changes in offender tactics, dependence on user reporting, and efficiency of content moderation pipelines.

The chart below presents the median time between sextortion events and their reports for the main three sextortion-reporting platforms over the last two years, showing variable referral speeds during the studied window of reports. 

Facebook and Instagram report referrals accelerated at the beginning of the study window, stabilizing with report referrals within days of the incident for the second half of 2022. However, in 2023, report referral time increased, arriving at a median period of more than a month by August 2023.

Reporting delay

Median number of days between incident and submission of report to NCMEC
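The metric charted here is straightforward to compute: for each report, take the number of days between the incident and its submission to NCMEC, then take the median across reports. A minimal sketch, using hypothetical dates (real CyberTipline data is not public):

```python
from datetime import date
from statistics import median

# Hypothetical (incident_date, report_date) pairs; illustrative only,
# not actual CyberTipline records.
records = [
    (date(2023, 7, 1), date(2023, 8, 10)),
    (date(2023, 7, 5), date(2023, 8, 20)),
    (date(2023, 7, 20), date(2023, 8, 1)),
]

# Per-report delay in days, then the median reporting delay.
delays = [(report - incident).days for incident, report in records]
print(median(delays))  # prints 40 for these illustrative dates
```

In practice this would be computed per platform and per sampling window to produce the trend lines described above.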

Initial analysis of Facebook and Instagram reports from November 2023 indicates shorter report referral times. This suggests the dip followed by an increase in reporting volume may be due to slower reporting rather than an actual dip and increase in sextortion activity. 

Similarly, the data suggests the spike in report activity in late 2022 is not a result of a reporting backlog, as the time between event and report is relatively short, but may point to improved detection and/or increased sextortion activity. 

Although increasing time lags in reporting are concerning, it is important to recognize the inherent difficulties in detecting and responding to sextortion cases. We do not know if these delays are primarily due to changes in content moderation, advances in detection technology, or shifts in user or perpetrator behavior. Furthermore, we should acknowledge the positive impact of periods where platforms were swiftly responding to sextortion events and focus on how platforms can be encouraged to maintain such responsive reporting to NCMEC.

Report Contents

When an ESP observes signals suggesting immediate risk to a minor, they may utilize an “ESP escalation” field in which a report can be flagged to NCMEC for more urgent study, with a summary characterizing the event such as “This account is sextorting an apparent minor.”

Historically, Facebook and Instagram have frequently escalated sextortion cases and provided additional context, such as chat information. Snapchat has typically provided basic report information but seldom escalated cases and rarely included chat information or child victim information.

However, reports from Snapchat have seen some increase in report detail with the addition of small chat excerpts or other information about the victim.

The timeliness and detail provided in NCMEC reports impact how quickly and effectively we are combating financial sextortion and safeguarding victims. Changes to platform design and content moderation policies, particularly when evaluating the implementation of encryption, must account for this to mitigate unintended impacts on child safety.

Gaps in reporting

The number of platform reports of sextortion is not directly equal to the number of sextortion events on that platform. In fact, higher volumes of reports can also signal increased platform investment in detecting and combating this harm type on its service.

By examining the number of times specific platforms are mentioned in public reports (as compared to the volume of reports made to NCMEC by the individual ESPs), we can offer one tentative way to estimate how many reports one might expect. 

The chart below shows a distribution over how often platforms are mentioned in sextortion cases submitted to NCMEC by the public (via public form or hotline) over the last three years, as logged by NCMEC analysts. It shows that Snapchat is mentioned almost as often as Instagram and that there are a range of platforms that are mentioned more than 20 times in any direct reports to NCMEC.

Platform mention in public reports

Platforms mentioned 30 or more times in sextortion cases

Note: Mentions of Google encompassed messaging services but not YouTube.

In this data, we see gaps between how many reports of sextortion an ESP submits to NCMEC, compared to how often that ESP is mentioned in public reports. For example, while Snapchat was mentioned nearly as often as Instagram and far more than Facebook in public reports, report volume directly from the platform is almost half that of Facebook and a quarter as much as Instagram. However, the latest report period in August showed an uptick in Snapchat reports, indicating possible improvements in reporting workflows.

Similarly, Discord was mentioned in public reports at a somewhat similar pace to Omegle or Wink (all being mentioned in sextortion reports roughly once every two weeks) and half as often as WhatsApp or Wizz, but Discord submitted more reports of sextortion to NCMEC in the periods we sampled than all of those platforms combined (although some platforms, such as WhatsApp, may relay reports through other platforms).

Note: No reports were submitted by Wizz. ‘Google’ here encompasses Google messaging products but does not count mentions of YouTube. Meta at times submits a single CyberTipline report regarding an event involving multiple Meta services (for example, a report made by Facebook may include sextortion occurring on WhatsApp). In these instances, the data in this study only reflects the specific platform that made the report.

Lower rates than anticipated by comparing public and ESP reports could occur for several reasons. Some platforms may be better at proactive detection, and some parts of the sextortion experience may be more likely to be reported by the victim. In addition, some platforms may not have reporting flows that allow victims to easily convey that sextortion is occurring or may fail to pass along the data clearly to NCMEC. 

Financial payment platforms may also report to NCMEC. We observe sextortion reports from PayPal Inc., which includes both Venmo and PayPal. Other financial service companies either are not registered to report to NCMEC or do not make substantive reports; for example, Block Inc., which operates Cash App, the most commonly mentioned payment platform in the sextortion reports we examined.

Looking ahead

The analysis of data available in CyberTipline reports affords us concrete data regarding the scale, tactics, and impacts of financial sextortion. The emergence of organized, global threats demands cross-sector and multi-layered collaboration to effectively safeguard young people from this growing threat. Since the final reports were analyzed in this study, we’ve been heartened to see several examples of this type of innovation and collaboration, and we’re eager to see this work continue.

Technology is being used to scale attacks while threats of dire outcomes – which often mirror societal warnings – fuel victim isolation.

Financial sextortion is a global phenomenon. Detection tools, moderation endeavors, and prevention messaging must be built with this in mind, serving both English and non-English-speaking young people.

Though most victims are manipulated into sharing intimate images through catfishing and in response to an initial share of intimate imagery from a perpetrator, financial sextortion does not exclusively target children who have shared an intimate image, and the use of genAI technologies may lead to an increase in these cases.

Financial sextortion relies heavily on inflaming a victim’s fears around the impact of having their nudes exposed, such as that they would go viral or send the child to jail.

Safeguarding tactics must evolve beyond ‘just don’t share images’ and, if you do, ‘your life will be ruined’. The threat of life-altering consequences is being weaponized to silence and isolate victims.

Platform reporting of sextortion is widely varied, and sextortion is less likely to be reported on some platforms.

Platform reports play a vital role in identifying and escalating sextortion activity to NCMEC, as well as informing the public’s understanding of the evolving nature of sextortion. However, not all reports are created equal. Reports range in detail. This not only impacts the ability to action the report to safeguard impacted victims, it also limits our ability to explore the issue and the efficacy of existing platform interventions ranging from user reporting to proactive detection as well as develop novel solutions to combat the threat.

Further, reports from the public concerning financial sextortion suggest a far wider list of impacted platforms than are actively reporting to the CyberTipline. While opportunities exist to optimize report contents among active reporting platforms, current reporting volumes from some platforms lag considerably behind public reports. A multitude of tactics can be deployed, including in private messaging services, to improve user safety. Layering proactive and reactive solutions can serve to both deter abuse of services and encourage user-controlled safety actions.

Methodology

Samples and analysis

This research was conducted by Thorn in partnership with the National Center for Missing & Exploited Children (NCMEC). NCMEC is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization.

NCMEC received over 32 million reports in 2022. To focus on a representative but reasonable amount of data, our analysis concentrated on a subset of reports received between 2020 and 2023. We defined four two-week periods each year for the last three years (every three months, starting on the 8th and ending on the 21st of February, May, August, and November) and studied all reports submitted to NCMEC within those periods (the sampled data totals more than 15 million reports).
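The sampling scheme described above can be sketched in code. The exact list of years is an assumption based on the 2020 to 2023 study period; the report notes that the last studied window was August 2023, so the final windows generated here may extend past the data actually analyzed:

```python
from datetime import date

# Four two-week sampling windows per year: the 8th through the 21st of
# February, May, August, and November. The year range is an assumption
# based on the 2020-2023 study period described in this report.
def sampling_windows(years=range(2020, 2024)):
    windows = []
    for year in years:
        for month in (2, 5, 8, 11):
            windows.append((date(year, month, 8), date(year, month, 21)))
    return windows

for start, end in sampling_windows():
    print(start, "->", end)
```

All reports submitted to NCMEC inside these windows (more than 15 million in total) formed the analyzed sample.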

This study was limited to specific fields within CyberTipline reports, as prepared and provided by NCMEC, and did not include attached files such as screenshots of chat logs or other image files.

Our analysis highlighted all reports appearing to relate to sextortion, building off initial annotations provided by NCMEC analysts as part of the report intake process. This report started from those annotations and augmented the initial sample of sextortion reports using machine learning algorithms to identify potential sextortion cases for additional annotation. Cases flagged through this process were manually reviewed to verify inclusion for analysis. Reports in the sampled data were manually coded to measure variables such as tactics, platform mentions, monetary quantities, and victim impacts.
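The triage idea in that pipeline, an automated scorer flagging candidate reports for human annotation, can be shown in miniature. This is an illustrative sketch only: the report does not specify which machine learning algorithms were used, and a simple keyword score stands in here for a trained classifier:

```python
# Hypothetical indicator terms; a real system would use a trained model,
# not a fixed word list.
SEXTORTION_TERMS = {"pay", "leak", "expose", "nudes", "bitcoin", "countdown"}

def triage_score(report_text: str) -> float:
    # Fraction of indicator terms present in the report text.
    words = set(report_text.lower().split())
    return len(words & SEXTORTION_TERMS) / len(SEXTORTION_TERMS)

def flag_for_review(reports, threshold=0.25):
    # Reports scoring above the threshold go to manual review, mirroring
    # the verify-before-inclusion step described above.
    return [r for r in reports if triage_score(r) >= threshold]

reports = [
    "he said pay now or i leak and expose your nudes",
    "question about resetting my password",
]
print(flag_for_review(reports))  # only the first report is flagged
```

The key design point, matching the methodology above, is that automation only surfaces candidates; inclusion in the analyzed sample still requires manual verification.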

Limitations
  • This analysis is limited to the data contained in CyberTipline reports, as provided by NCMEC. As such, events not reported to the CyberTipline or data not included in the report’s contents are not included.
  • Analyzed data is from sampled time windows and thus can only be used as an estimate.
  • This is a rapidly evolving abuse type. Tactics and tools may have evolved since the last studied reports.

Download the full report for a more detailed discussion of methodology and limitations.

Suggested citation

Thorn and National Center for Missing and Exploited Children (NCMEC). (2024). Trends in Financial Sextortion: An investigation of sextortion reports in NCMEC CyberTipline data.



Resources

Financial sextortion continues to be a major issue, and it is important to have resources available that can address financial sextortion and help children. Some important resources are provided below:

For those experiencing sextortion

If you believe you or someone you know is a victim of exploitation, file a report with NCMEC’s CyberTipline, or ask for help directly at gethelp@ncmec.org or 1-800-THE-LOST.

For those located outside of the US, use the InHope hotline directory to find your local hotline.

Learn more about the steps you can take:

For those worried about their imagery being shared

Take It Down is a free service that can help you remove or stop the online sharing of nude, partially nude, or sexually explicit images or videos taken of you when you were under 18 years old. You can remain anonymous while using the service, and you won’t have to send your images or videos to anyone.

If you are over the age of 18, you can initiate a case with StopNCII.

For more information about what to do when your imagery is at risk, check out NCMEC’s resources.

Additional resources and information on sextortion

To learn more about sextortion, head over to our dedicated topic page on Grooming and Sextortion.

For more information and resources, take a look at the following:

Interested in Thorn research?

Join our research distribution list to stay up to date on our latest research findings.
