Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of child sexual abuse material (CSAM). By definition, any sexually explicit content featuring a minor is exploitative and unlawful. The creation of CSAM results in a permanent record of a child’s victimization. (Source: “Child Sexual Abuse Material,” U.S. Department of Justice, June 2023: Link)
The owners and operators of the AZMen platform work diligently to ensure that our website is never used to violate federal law or to facilitate harm to minors through the creation or distribution of CSAM. We find child sexual abuse material abhorrent in every form; we investigate every case thoroughly and go beyond standard reporting requirements to safeguard children.
What is CSAM?
CSAM refers to any image or video depicting sexually explicit conduct, including nudity, involving individuals under the age of 18. Such material represents child sexual abuse and exploitation in every instance.
We report all suspected incidents of CSAM to the National Center for Missing & Exploited Children (NCMEC).
How does AZMen identify CSAM on its platform?
We employ a comprehensive strategy involving cutting-edge technology, human moderators, and strategic partnerships with organizations such as NCMEC to identify, remove, and report CSAM. Our approach includes:
- Automated Technology: We leverage advanced scanning tools, including Microsoft's PhotoDNA, to detect known CSAM. We also use NCMEC's Take It Down hash list to proactively block the posting of harmful content. Additionally, we have integrated Cloudflare's CSAM Scanning Tool, which proactively identifies CSAM served through our website's cache by comparing cached content against known-CSAM lists provided by leading child safety organizations, such as NCMEC, ensuring rapid detection and action at the infrastructure level. (Source: Cloudflare CSAM Scanning Documentation)
- Human Review: All our content moderators undergo rigorous training to identify, remove, and promptly escalate any suspected CSAM. Our initial automated inspection—including cache-level scanning—is followed by a manual review within 24 hours to ensure thorough oversight.
Any content flagged as suspicious is immediately removed from public view and subjected to a deeper investigation by our safety team, which is specially trained to handle and report CSAM-related cases.
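The screening pipeline described above (automated hash matching first, human review as a backstop) can be sketched as follows. This is a minimal illustration only: the hash list, function names, and outcome labels are assumptions, and SHA-256 stands in for perceptual hashing technologies such as PhotoDNA, which match visually similar images rather than exact bytes.

```python
import hashlib

# Illustrative stand-in for a hash list supplied by a child-safety
# partner. Real deployments use perceptual hashes (e.g. PhotoDNA),
# not plain SHA-256; this entry is the digest of the empty byte string.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(data: bytes) -> bool:
    """Return True when the upload's digest appears in the hash list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

def screen_upload(data: bytes) -> str:
    # Matched content is removed from public view and escalated at once;
    # everything else still receives a manual review within 24 hours.
    return "block_and_escalate" if is_known_match(data) else "queue_for_review"
```

The key design point is that hash matching only catches *known* material, which is why a human review stage follows every automated pass.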
How does AZMen detect new CSAM?
“New” CSAM is material not previously identified in existing databases and thus may be more challenging to detect. We diligently analyze images, text, and audio components to uncover potentially new CSAM. Any discovery of new or suspected CSAM is promptly escalated, reported to relevant law enforcement partners, and shared with non-governmental organizations to disrupt further distribution.
What happens when AZMen finds suspected CSAM on its platform?
Upon identifying suspected CSAM, we act immediately to remove the content and file a CyberTipline report with NCMEC.
NCMEC reviews these reports and shares them with relevant law enforcement agencies worldwide. We cooperate fully with law enforcement to investigate, prosecute, and enforce penalties against individuals who misuse our platform for CSAM.
We also conduct internal investigations into any user attempting to share CSAM on AZMen and ban those who violate our policies. These proactive measures ensure that we exceed mandated reporting requirements and maintain a secure environment for all users.
How do I report suspected CSAM?
If you encounter any content on AZMen that you suspect constitutes CSAM, please click the report button or email us immediately at contact@azmen.com.
What else does AZMen do to prevent the creation or distribution of CSAM?
Our team maintains active partnerships with governments, regulatory bodies, law enforcement, non-governmental organizations, charities, and other technology platforms to jointly combat CSAM. We respond immediately to intelligence from trusted safety partners about potential threats or emerging child safety risks. We also collaborate with academic researchers and subject-matter experts to remain abreast of new tactics used by offenders, thereby enhancing our preventative strategies.
We utilize NCMEC’s Take It Down initiative to proactively prevent the upload of self-generated sexually explicit content involving minors (which constitutes CSAM). For individuals under 18 concerned about the unauthorized distribution of personal images, the Take It Down service generates unique hashes of those images that help block them from being uploaded to participating platforms, including AZMen.
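The privacy-preserving idea behind this hash-based blocking can be sketched as follows. The point is that only a fingerprint of the image is ever shared, never the image itself; SHA-256 here is an illustrative stand-in for the service's actual fingerprinting, and all names are assumptions.

```python
import hashlib

def fingerprint_locally(image_bytes: bytes) -> str:
    """Compute a digest on the user's own device.

    Only this hex digest is submitted to participating platforms;
    the image itself never leaves the device.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A platform holding the submitted hash list can refuse matching
# uploads without ever having seen the original image.
blocklist = {fingerprint_locally(b"example-image-bytes")}
```

A platform-side check is then a simple set membership test against `blocklist`, performed before any upload is published.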
If you wish to learn more about our efforts in combating the creation or distribution of CSAM, please contact us at contact@azmen.com.
Collaboration with Law Enforcement
In our unwavering commitment to safeguarding children, AZMen collaborates closely with international and national law enforcement authorities. We have implemented a robust tool for preventing and combating the uploading, display, and dissemination of CSAM and depictions of non-consensual sexual acts. In addition to allowing authorities to securely download the reported video, this tool provides key information, which may include:
- Date and time of video upload
- Date and time of video removal
- Uploader’s email address
- Uploader’s IP address
- Port associated with the uploader’s IP address
- URL(s) where the video appeared
- Video thumbnails for identification
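The metadata fields listed above could be modeled as a simple record like the one below. The field names and types are purely illustrative assumptions for clarity, not the tool's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class RemovalReport:
    """One reported video, as made available to law enforcement.

    Illustrative schema only; real field names are an assumption.
    """
    uploaded_at: datetime            # date and time of video upload
    removed_at: Optional[datetime]   # None while removal is pending
    uploader_email: str
    uploader_ip: str
    uploader_port: Optional[int]     # port associated with the IP
    urls: list[str]                  # every URL where the video appeared
    thumbnail_paths: list[str]       # thumbnails retained for identification
```

Keeping upload and removal timestamps side by side lets investigators establish how long the content was accessible.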
We forward all reports involving minors to the law enforcement agencies of various countries, based upon the IP address of the uploader:
| IP of the Uploader | Notice Forwarded To |
|---|---|
| BRAZIL | Polícia Civil do Distrito Federal; Polícia Federal (Portal da Polícia Federal); Precinct for Crime Suppression of IT |
| CZECH REPUBLIC | Národní centrála proti organizovanému zločinu, Odbor kybernetické kriminality |
| UK | The National Crime Agency (NCA), Child Exploitation and Online Protection (CEOP) command |
| FRANCE | Police Nationale, Sous-direction de lutte contre la cybercriminalité, Office central de lutte contre la criminalité liée aux technologies de l’information et de la communication; Pôle Judiciaire de la Gendarmerie Nationale, STRJD, Division de lutte contre la cybercriminalité, Département de répression des atteintes aux mineurs sur Internet (RAMI); Analyste CNAIP / Enquêteur NTECH |
| GERMANY | Bundeskriminalamt (BKA) |
| ITALY | Polizia Postale e delle Comunicazioni |
| PERU | Policía Nacional del Perú, División de Investigación de Delitos de Alta Tecnología |
| USA | Federal Bureau of Investigation (FBI) |
| SWITZERLAND | KOBIK – Koordinationsstelle zur Bekämpfung der Internet-Kriminalität |
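The routing described by the table above can be sketched as a simple lookup keyed on the country derived from the uploader's IP address. The GeoIP resolution step is assumed and omitted here, the agency strings are abbreviated, and the fallback behavior is an illustrative assumption rather than AZMen's documented procedure.

```python
# Illustrative mapping from the uploader's IP-derived country to the
# agency that receives the notice, mirroring the table above.
NOTICE_RECIPIENTS = {
    "BR": "Polícia Civil do Distrito Federal / Polícia Federal",
    "CZ": "Národní centrála proti organizovanému zločinu",
    "GB": "National Crime Agency (CEOP command)",
    "FR": "Police Nationale (OCLCTIC) / Gendarmerie Nationale (RAMI)",
    "DE": "Bundeskriminalamt (BKA)",
    "IT": "Polizia Postale e delle Comunicazioni",
    "PE": "Policía Nacional del Perú",
    "US": "Federal Bureau of Investigation (FBI)",
    "CH": "KOBIK",
}

def route_notice(country_code: str) -> str:
    """Pick the recipient agency for a notice by ISO country code.

    The fallback to NCMEC's CyberTipline for unlisted countries is an
    assumption of this sketch.
    """
    return NOTICE_RECIPIENTS.get(country_code, "NCMEC CyberTipline (default)")
```

In practice, every report involving minors is also filed with NCMEC regardless of the uploader's country, as described earlier in this policy.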
Our proactive approach, stringent monitoring, and strong partnerships with law enforcement agencies worldwide reinforce our commitment to eradicating CSAM. AZMen stands resolute in its mission to protect children, ensuring that any attempt to exploit minors on our platform is met with swift and decisive action.