Safety Standards

Last Updated: October 29, 2025

Our Commitment to Safety

trxy™ is committed to maintaining a safe environment for our skateboarding community. We have zero tolerance for child sexual abuse and exploitation (CSAE) and employ comprehensive technical and operational measures to prevent, detect, and respond to such content.

Child Safety Standards

Age Restrictions

  • trxy™ is restricted to users aged 13 and older
  • We do not knowingly collect information from children under 13
  • Age verification occurs during account creation
  • Suspected underage accounts are immediately suspended and investigated

Prohibited Content

We strictly prohibit:

  • Child sexual abuse material (CSAM) of any kind
  • Content that sexualizes, grooms, or exploits minors
  • Content depicting, encouraging, or facilitating child abuse or exploitation
  • Sharing of personal information of minors without consent
  • Predatory behavior or communications targeting minors
  • Any content that violates laws protecting children

Technical Safety Measures

Automated Content Moderation

We employ industry-leading AI and machine learning tools to protect our community:

Text Content Moderation

  • Google Perspective API analyzes all user-generated text in real time
  • Automatically detects and blocks toxic, threatening, or sexually explicit content
  • Flags potentially harmful communications for review
  • Applies to challenges, comments, profile information, and all text-based content
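For illustration, the flow above might look roughly like the following sketch. The endpoint and attribute names (`TOXICITY`, `SEXUALLY_EXPLICIT`, `THREAT`) come from the public Perspective API; the thresholds, function names, and blocking policy are hypothetical examples, not trxy's actual configuration.

```python
import json
import urllib.request

# Hypothetical cutoffs for illustration only -- not trxy's real policy values.
BLOCK_THRESHOLDS = {"TOXICITY": 0.85, "SEXUALLY_EXPLICIT": 0.70, "THREAT": 0.80}

def analyze_text(text: str, api_key: str) -> dict:
    """Send text to the Perspective API and return its summary scores (0.0-1.0)."""
    url = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           f"comments:analyze?key={api_key}")
    body = json.dumps({
        "comment": {"text": text},
        "requestedAttributes": {attr: {} for attr in BLOCK_THRESHOLDS},
    }).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    return {attr: data["summaryScore"]["value"]
            for attr, data in result["attributeScores"].items()}

def should_block(scores: dict) -> bool:
    """Block publication if any attribute meets or exceeds its threshold."""
    return any(scores.get(attr, 0.0) >= cutoff
               for attr, cutoff in BLOCK_THRESHOLDS.items())
```

In a pipeline like this, content whose scores clear any threshold is blocked outright, while borderline scores can instead be routed to the human review queue described below.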

Image Content Moderation

  • Google Cloud Vision API scans all uploaded images and videos
  • Detects explicit content, violence, and other unsafe material
  • Images flagged as unsafe are immediately blocked from upload
  • Multi-layered detection includes:
    • SafeSearch detection for adult content
    • Violence detection
    • Racy content detection
    • Custom safety classifiers
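As a rough sketch, SafeSearch-based screening can work like this. The `SAFE_SEARCH_DETECTION` feature and the likelihood scale (`VERY_UNLIKELY` through `VERY_LIKELY`) are part of the public Cloud Vision API; the rejection floors and function names here are hypothetical, not trxy's actual settings.

```python
import base64
import json
import urllib.request

LIKELIHOOD_RANK = {"UNKNOWN": 0, "VERY_UNLIKELY": 1, "UNLIKELY": 2,
                   "POSSIBLE": 3, "LIKELY": 4, "VERY_LIKELY": 5}

# Hypothetical policy: reject at or above these likelihoods per category.
REJECT_AT = {"adult": "LIKELY", "violence": "LIKELY", "racy": "VERY_LIKELY"}

def safe_search(image_bytes: bytes, api_key: str) -> dict:
    """Run Cloud Vision SafeSearch on an image; return likelihood strings."""
    url = f"https://vision.googleapis.com/v1/images:annotate?key={api_key}"
    body = json.dumps({"requests": [{
        "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
        "features": [{"type": "SAFE_SEARCH_DETECTION"}],
    }]}).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["responses"][0]["safeSearchAnnotation"]

def is_unsafe(annotation: dict) -> bool:
    """True if any category meets or exceeds its rejection likelihood."""
    return any(
        LIKELIHOOD_RANK[annotation.get(cat, "UNKNOWN")]
        >= LIKELIHOOD_RANK[floor]
        for cat, floor in REJECT_AT.items())
```

An upload flagged by `is_unsafe` would be blocked before it is ever published, matching the "blocked from upload" behavior described above.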

Proactive Detection

  • Automated scanning of all user-generated content before publication
  • Hash-matching against known CSAM databases
  • Pattern recognition for grooming behaviors
  • Suspicious activity flagging and alerts
  • Regular security audits and updates to detection systems
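The hash-matching step above can be pictured as a set-membership check, sketched below with a cryptographic digest for simplicity. Production systems typically use perceptual hashes (for example Microsoft PhotoDNA, with hash lists supplied under agreement with NCMEC), which survive resizing and re-encoding; a cryptographic digest only catches byte-identical files. The function names here are illustrative.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of the upload's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(upload: bytes, known_hashes: set) -> bool:
    """Exact-match check of an upload against a set of known-bad digests.

    Simplification: real CSAM detection uses perceptual hashing so that
    re-encoded or lightly edited copies still match; this sketch only
    demonstrates the lookup structure of the pipeline.
    """
    return sha256_hex(upload) in known_hashes
```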

Human Moderation

In addition to automated systems, we maintain:

  • Manual review of flagged content within 24 hours
  • Trained moderation team familiar with child safety protocols
  • Clear escalation procedures for serious violations
  • Regular training on emerging threats and safety best practices

Reporting Mechanisms

In-App Reporting

Users can report unsafe content or behavior directly within the app:

  • Report button available on all user-generated content (profiles, challenges, spots, comments)
  • Dedicated "Child Safety Violation" category for content involving minors in unsafe or inappropriate situations
  • Additional categories: Harassment/Bullying, Inappropriate Content, Dangerous Activity, and more
  • Anonymous reporting option
  • Immediate flagging for priority review
  • Reporter protection from retaliation

External Reporting

For urgent safety concerns:

Response Procedures

Report Review Process

All user reports are reviewed according to the following process:

Daily Report Triage

  • All incoming reports (including "Child Safety Violation" reports) are reviewed daily
  • Each report is assessed for validity and severity
  • Most reports are resolved through standard moderation actions

Confirmed CSAM Response

When our review confirms that reported content contains CSAM or child exploitation, we take immediate action the same day:

  1. Immediate content removal - Content is removed within minutes of confirmation
  2. Account suspension/ban - The user's account is immediately terminated and the associated IP address banned
  3. Evidence preservation - All relevant data (images, metadata, user information) is securely preserved
  4. Same-day reporting - Report submitted to NCMEC CyberTipline or relevant international authority on the same business day, in compliance with "as soon as reasonably possible" requirements under 18 U.S.C. § 2258A
  5. Law enforcement cooperation - Full cooperation with any subsequent investigations

Law Enforcement Cooperation

We fully cooperate with law enforcement agencies:

NCMEC CyberTipline Reporting

When we obtain actual knowledge of CSAM or child exploitation on our platform, we:

  1. Report to NCMEC (National Center for Missing & Exploited Children) via the CyberTipline as soon as reasonably possible, in compliance with 18 U.S.C. § 2258A
  2. Preserve evidence for one year minimum in accordance with the REPORT Act (2024)
  3. Secure data following NIST Cybersecurity Framework standards
  4. Cooperate fully with law enforcement investigations
  5. Provide information to authorized agencies upon lawful request

International Reporting

For users outside the United States, we report to relevant regional and national authorities:

  • International: INTERPOL, INHOPE hotlines
  • Europe: EU Safer Internet Centres
  • UK: Internet Watch Foundation (IWF)
  • Canada: Canadian Centre for Child Protection (Cybertip.ca)
  • Australia: eSafety Commissioner
  • Other regions: Local law enforcement and child protection agencies

User Account Actions

Violations result in:

  • Permanent account termination
  • Device and IP address bans
  • Reporting to authorities when legally required
  • No appeals for CSAE violations

Privacy and Data Protection

We balance safety with privacy by:

  • Only scanning content necessary for safety
  • Encrypting stored evidence
  • Limiting access to authorized personnel
  • Following data retention policies that comply with legal requirements
  • Protecting reporter identities

Ongoing Safety Improvements

We continuously enhance our safety measures through:

  • Regular updates to detection algorithms
  • Monitoring emerging threats and trends
  • Participating in industry safety coalitions
  • Soliciting feedback from safety experts
  • Quarterly safety audits
  • Staff training on child safety protocols

Transparency

We are committed to transparency in our safety efforts:

  • Annual publication of safety metrics (when meaningful data exists)
  • Regular updates to safety policies
  • Community education on safety features
  • Clear communication about safety violations

Contact for Safety Concerns

General Safety Questions

Emergency Safety Reports

Legal Compliance

Our safety standards comply with:

  • 18 U.S.C. § 2258A - CSAM reporting requirements to NCMEC
  • REPORT Act (2024) - Extended data retention and reporting obligations
  • Children's Online Privacy Protection Act (COPPA) - Protection for users under 13
  • CAN-SPAM Act - Email communication standards
  • State and international child protection laws - Applicable regional regulations
  • Google Play Child Safety Standards - Platform-specific requirements

Parental Resources

We provide resources for parents and guardians:

  • Safety tips for monitoring minors' use of the app (ages 13+)
  • Information about our safety features
  • Guidance on discussing online safety
  • Contact information for concerns

Note: If you believe a child is in immediate danger, contact local law enforcement or call the National Child Abuse Hotline at 1-800-4-A-CHILD (1-800-422-4453).

Changes to Safety Standards

We may update these safety standards as we enhance our protections. Material changes will be communicated through:

  • In-app notifications
  • Email to registered users
  • Updates on this page

For questions about these safety standards, contact us at trxy.beta@gmail.com.