Safety Standards Policy

Last Updated: February 2025

Commitment to Child Safety

Walky is committed to creating a safe and secure environment for all users, especially minors. We have a zero-tolerance policy for child sexual abuse and exploitation (CSAE) and strictly enforce measures to prevent any form of harm to children on our platform.

Definitions

Child Sexual Abuse and Exploitation (CSAE): CSAE refers to any content, activity, or behavior that sexually exploits, abuses, or endangers children. This includes but is not limited to:

  • Grooming a child for sexual exploitation
  • Sextorting a child
  • Trafficking a child for sex
  • Sharing, promoting, or producing sexually explicit material involving a minor

Child Sexual Abuse Material (CSAM): CSAM consists of any visual depiction (including photos, videos, and computer-generated imagery) involving the use of a minor in sexually explicit conduct. It is illegal, and our Terms of Service strictly prohibit the use of Walky for storing, sharing, or distributing CSAM.
For more information, visit Google’s Transparency Report Help Center.

Published Standards on CSAE & Child Safety

Walky maintains a publicly accessible resource outlining our policies and standards regarding CSAE. This page:

  • Is globally accessible and functional
  • Clearly outlines Walky’s stance on CSAE and child safety
  • References Walky by name as it appears on the Google Play Store
  • Provides anchor links for easy navigation

This resource is available at walkyapp.com/safety and is incorporated into our Help Center, Terms of Service, and Community Guidelines.

In-App Feedback Mechanism

Walky provides users with an in-app system to report violations without leaving the app. Our system includes:

  • ‘Report’ and ‘Block’ options in profiles to flag inappropriate behavior or messages.
  • A dedicated support email (safety@walkyapp.com)

Users can block and report concerns directly in the app. For further support, they can also contact our Trust & Safety team at safety@walkyapp.com, which is listed within the same reporting section.

All reports are reviewed promptly by our safety team, and appropriate action is taken based on our safety policies.

Addressing Child Sexual Abuse Material (CSAM)

Walky follows strict procedures to detect and remove CSAM:

  • Automated detection systems and third-party moderation tools help flag potential violations.
  • Human moderators review flagged accounts and investigate reports.
  • CSAM is removed immediately when we obtain actual knowledge of it.
  • Confirmed material is reported to law enforcement and child protection organizations, as required by law.

Walky complies with all applicable child safety laws and follows best practices outlined by the Tech Coalition Child Safety Standards.

Compliance with Child Safety Laws

Walky adheres to all relevant laws and regulations, including:

  • Children’s Online Privacy Protection Act (COPPA)
  • General Data Protection Regulation – Kids (GDPR-K)
  • EARN IT Act and other applicable U.S. laws
  • Country-specific regulations governing child safety and digital platforms

Walky’s age-verification measures and content moderation policies align with industry standards.

Enforcement & Policy Updates

Failure to comply with Walky’s Child Safety Standards may result in:

  • Account suspension or permanent bans
  • Content removal
  • Legal reporting to authorities
  • Potential removal from Google Play

We conduct regular policy reviews and update our standards in response to evolving safety needs, regulatory changes, and best practices.

CSAM Prevention Point of Contact

To ensure child safety concerns are addressed promptly, Walky has designated a CSAM Compliance Contact who serves as the primary point of contact for inquiries related to child safety and compliance.

This individual is responsible for ensuring regulatory compliance and acting as a liaison with Google Play and legal authorities if needed.

CSAM Contact Information: Jeremiah Newman
safety@walkyapp.com