Aftrmatch Policy on Child Sexual Abuse and Exploitation (CSAE)

At Aftrmatch, we believe in creating a safe, respectful, and inclusive environment where meaningful adult connections can thrive. While it Starts With a Swipe™, our responsibility to safety runs deeper—especially when it comes to protecting vulnerable individuals. We have zero tolerance for any form of Child Sexual Abuse and Exploitation (CSAE). This policy outlines our standards, expectations, and enforcement practices to keep Aftrmatch free from CSAE and to comply with applicable laws and industry best practices.

1. No Minors Allowed
Aftrmatch is strictly for adults aged 18 and older. Any account found to be operated by a minor, or any attempt to misrepresent age to access the platform, will result in immediate and permanent removal.

We do not allow:

  • Photos of unaccompanied or unclothed minors, including childhood images of yourself.

  • Language that suggests or implies attraction to minors.

  • Discussions or depictions of minors in a sexual context, under any circumstances.

2. Absolute Prohibition on CSAE Content and Behavior
Any content, communication, or behavior—whether on-platform or off-platform—that involves, suggests, promotes, or seeks to engage in CSAE will result in immediate termination of the account and will be reported to the appropriate authorities, including law enforcement and child protection agencies.

This includes but is not limited to:

  • Sharing, soliciting, or possessing child sexual abuse material (CSAM).

  • Grooming behavior—any attempt to establish an emotional connection with a minor to sexually exploit them.

  • Inappropriate conversations, fantasies, or roleplays involving minors.

  • Attempts to meet minors through Aftrmatch or by using it as a gateway to contact them elsewhere.

3. Mandatory Reporting and Legal Compliance
Aftrmatch complies with all applicable local, national, and international laws related to CSAE. We will report any credible suspicion of child sexual abuse or exploitation to the National Center for Missing & Exploited Children (NCMEC), law enforcement, or other designated authorities, as required by law.

We cooperate fully with law enforcement investigations and support initiatives that protect children online.

4. Technology and Human Moderation to Detect CSAE
Aftrmatch uses a combination of automated detection tools, AI content moderation, and trained human moderators to identify CSAE content and behaviors. This includes:

  • Hash-matching of known CSAM.

  • Behavioral pattern detection.

  • Manual review of flagged accounts and content.

5. Education, Awareness & Prevention
We’re committed to equipping our users with information and resources to identify and report CSAE. Aftrmatch provides:

  • In-app safety resources and support links.

  • Guidance on how to recognize grooming and exploitation.

  • Confidential channels to report suspicious behavior.

6. Strict Enforcement and Zero Tolerance
Violations of this policy will be treated with the highest severity and will result in:

  • Immediate and permanent account termination.

  • Reporting to law enforcement and relevant child safety organizations.

  • Bans across the Aftrmatch platform and affiliated services.

  • No refunds for users found in violation of this policy.

We do not issue warnings for CSAE-related infractions. This policy is enforced with zero tolerance.

7. Reporting CSAE
If you encounter behavior or content that you believe may be related to CSAE, please report it immediately through our in-app reporting tools or by contacting reach@aftrmatch.com. Your report will be treated confidentially, reviewed swiftly, and escalated appropriately.

Every report matters. By reporting CSAE, you are helping protect others and maintain the integrity of the Aftrmatch community.

Aftrmatch’s Commitment

We are unwavering in our commitment to fight child sexual abuse and exploitation in all forms. CSAE has no place here—not on our platform, not in our messages, and not in our community.

Aftrmatch reserves the right to investigate and take action on any off-platform behavior involving CSAE, especially if it endangers our users or others, even if the conduct occurred outside the app.
