Child Sexual Abuse Material (CSAM) Policy
Last updated: September 5, 2025
Goodchat is committed to the safety and protection of children. We maintain a zero-tolerance policy for any content that sexualizes, exploits, or endangers children. This includes but is not limited to:
- Images, videos, audio, or text depicting minors in a sexualized manner.
- Computer-generated, illustrated, or fictional depictions of child sexual abuse.
- Written content that references or alludes to the sexual exploitation of children, including in messages, titles, or profile descriptions.
Goodchat also supports the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, a global initiative launched by the Five Country Ministerial (Five Eyes). The Five Country Ministerial, comprising security leaders from Australia, Canada, New Zealand, the United Kingdom, and the United States, promotes cross-sector collaboration on global security challenges, including the eradication of child sexual abuse material.
Scope
This policy applies to all Users and interactive features on the Service and supplements the Acceptable Use Policy.
Definitions
“Apparent Minor” means a person who reasonably appears to be a Minor.
“CSAM” means any content that depicts the sexual abuse or sexual exploitation of a Minor, or that is otherwise illegal as child sexual abuse content under applicable law.
“Minor” means any person under 18 years of age (or under a higher age of majority where applicable).
Prohibited Content
You must not request, send, upload, link to, generate, transform, describe, or otherwise provide any content on the Service (including to any Creator or any AI feature) that:
- Depicts, involves, or attempts to depict a Minor or Apparent Minor, real or fictional, in any context, whether sexual or otherwise (including ordinary family or school photos). If age is uncertain, treat the person as a Minor and do not submit the content.
- Promotes, advocates, or references child sexual exploitation or abuse, including suggestive content about Minors.
- Includes text, images, or other media that could be interpreted as sexualizing or endangering Minors.
- Depicts characters of ambiguous or uncertain age in a sexualized manner.
Detection and Moderation
We may use automated and human review to detect child-safety violations; we may block or remove content, suspend accounts, and restrict features to prevent harm. We may automatically filter or block media and links, including content that appears to depict a Minor.
Reporting and Removal
Anyone, including non-account holders, can report suspected CSAM in accordance with our Complaints Policy, either through the Report Content feature or by emailing abuse@goodchat.com with URLs, usernames, timestamps, and a description. Do not attach or send the imagery itself. We will acknowledge receipt and take appropriate action, including reporting to authorities where required. You may also report directly to NCMEC via its CyberTipline. If a Minor is in imminent danger, contact local law enforcement before reporting to Goodchat.
All flagged or detected content is immediately removed from user access. We preserve evidence securely as required by law and in response to lawful requests. We do not provide copies of CSAM to reporters or users; access is restricted to trained personnel.
Users may appeal a removal decision related to suspected CSAM in accordance with our Appeals Policy. Appeals are reviewed within a reasonable period and in compliance with law; an appeal does not delay mandatory reporting. Accounts involved in CSAM violations are generally not eligible for reinstatement, except in cases of verified error.
Mandatory Reporting
Goodchat complies with applicable child-protection and online-safety laws, including, without limitation, U.S. federal law (e.g., the PROTECT Act; 18 U.S.C. §§ 2256 and 2258A), EU Directive 2011/93/EU, and the UK Online Safety Act 2023. Upon becoming aware of apparent CSAM, Goodchat promptly reports it to the National Center for Missing & Exploited Children (NCMEC) consistent with 18 U.S.C. § 2258A and, where required by non-U.S. law, to the relevant national hotline (including via the INHOPE network). Goodchat cooperates with competent authorities, including through cross-border coordination with INTERPOL and Europol. Where required by law, or where notice would risk harm or interfere with an investigation, Goodchat may delay or withhold user notice.
Preservation & Data Handling
We securely retain data and content associated with suspected CSAM for a limited period, in compliance with law enforcement requests and applicable data protection laws. Upon lawful request (e.g., from NCMEC or law enforcement), we preserve relevant records and content for at least 90 days, and longer if required by law or court order. Access to preserved material is strictly limited to trained personnel on a need-to-know basis.
Transparency, Training, and Governance
We maintain child-safety training for trust-and-safety staff and keep internal procedures for escalation, reporting, and evidence handling. Where required (e.g., under the EU Digital Services Act or UK Online Safety Act), we maintain notice-and-action channels, cooperate with trusted flaggers/regulators, and include CSAM handling metrics in transparency reporting.
Additional Resources
If you need to report CSAM outside of Goodchat, or if you are seeking support, consider contacting the following organizations:
- National Center for Missing & Exploited Children (NCMEC)
- International Association of Internet Hotlines (INHOPE)
- Internet Watch Foundation (IWF)
- Canadian Centre for Child Protection
- ECPAT International
Updates
Goodchat regularly reviews and updates this policy to remain aligned with evolving legal requirements, best practices, and technological advancements in the fight against online child exploitation.