At NotNock, we take the safety of children on our platform seriously and are committed to maintaining a safe environment for all users. We comply with Google Play's Child Safety Standards policy and have implemented comprehensive measures to prevent child sexual abuse and exploitation.

This page outlines our Child Safety Standards policy and the measures we have in place to protect children on our platform. We encourage all users to familiarize themselves with these standards and to report any concerns or violations.

Policy Overview

Google Play takes the safety of children on its platform seriously and is committed to keeping its store free of child sexual abuse and exploitation. As an app in the Social category, NotNock complies with Google Play's Child Safety Standards policy, which requires the following:

  • Published standards against child sexual abuse and exploitation (CSAE)
  • In-app mechanism for user feedback
  • Measures to address child sexual abuse material (CSAM)
  • Compliance with child safety laws
  • Designated child safety point of contact

Our Standards Against Child Sexual Abuse and Exploitation

NotNock has a zero-tolerance policy for any content, behavior, or activity that exploits, abuses, or harms children. We prohibit:

  • Content that sexually exploits or abuses children, including but not limited to child sexual abuse material (CSAM)
  • Content that presents children in a sexualized context
  • Content that promotes, encourages, or facilitates the sexual exploitation of children
  • Behavior that targets children for sexual purposes, including grooming
  • Any attempt to use our platform to identify, contact, or communicate with children for sexual purposes
  • Sharing of personal information or images of children without appropriate consent

Any user found to be violating these standards will be immediately reported to the appropriate authorities, and their account will be permanently banned from our platform.

User Feedback Mechanism

NotNock provides multiple channels for users to report concerns, violations, or inappropriate content:

  • In-App Reporting: Every profile, message, and content piece has a "Report" option that allows users to flag inappropriate content directly within the app.
  • Dedicated Report Form: A comprehensive reporting form is available in the app's Safety Center, allowing detailed descriptions of concerns.
  • 24/7 Monitoring: Our moderation team reviews all reports, prioritizing those related to child safety.
  • Emergency Reporting: For urgent concerns involving imminent harm to a child, users can access our emergency reporting feature, which alerts our safety team immediately.

We commit to reviewing all reports promptly, with child safety concerns receiving the highest priority. Users who submit reports will receive acknowledgment of their report and, when appropriate, information about the actions taken.

Addressing Child Sexual Abuse Material (CSAM)

NotNock employs a multi-layered approach to prevent, detect, and remove CSAM from our platform:

  • Proactive Detection: We utilize advanced image and video scanning technology to automatically detect potential CSAM before it is visible to users.
  • Hash Matching: We implement hash-matching technology that compares uploaded content against databases of known CSAM.
  • Human Moderation: Our trained content moderation team reviews flagged content and user reports related to potential CSAM.
  • Immediate Removal: Any confirmed CSAM is immediately removed, the account is terminated, and the incident is reported to the National Center for Missing & Exploited Children (NCMEC) and relevant law enforcement agencies.
  • Preservation of Evidence: We preserve all relevant data in accordance with legal requirements to assist law enforcement investigations.

We regularly review and update our detection systems to ensure they remain effective against evolving threats.

Compliance with Child Safety Laws

NotNock complies with all applicable laws and regulations related to child safety, including but not limited to:

  • Children's Online Privacy Protection Act (COPPA)
  • The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA)
  • Relevant sections of the Communications Decency Act
  • Local and international laws regarding child protection, exploitation, and abuse

Our compliance measures include:

  • Age verification processes to prevent children under 13 from creating accounts
  • Enhanced privacy settings for users aged 13 to 17
  • Limitations on direct messaging and content sharing for minor users
  • Regular legal reviews of our policies and practices
  • Mandatory reporting of CSAM to NCMEC and law enforcement

We continuously monitor changes in relevant legislation and update our practices accordingly.

Child Safety Point of Contact

NotNock has designated a Child Safety Officer who oversees all aspects of our child safety program and serves as the primary point of contact for:

  • Law enforcement inquiries related to child safety
  • NCMEC and other child safety organizations
  • Escalated user reports concerning child safety
  • Internal teams handling child safety concerns

Our Child Safety Officer can be reached at:

Email: childsafety@notnock.com

Response Time: All inquiries to our Child Safety Officer are acknowledged within 24 hours and prioritized based on urgency.

Additional Resources

To learn more about child safety online, visit Google Play's Developer Policy Center for details on the Child Safety Standards policy, or review the Tech Coalition's best practices for combating online child sexual exploitation and abuse (CSEA).

Our Commitment to Ongoing Improvement

NotNock is committed to continuously improving our child safety measures through:

  • Regular reviews and updates of our policies and procedures
  • Ongoing training for our staff on child safety issues
  • Investment in new technologies to detect and prevent CSAM
  • Collaboration with industry partners and child safety organizations
  • Regular transparency reporting on our child safety efforts

We welcome feedback from our users, partners, and the broader community on how we can further strengthen our child safety measures.

Powered by Alpha Match Technology Limited