Our commitment
YOTCH Ltd maintains a zero-tolerance position on child sexual abuse and exploitation (CSAE) in any form. We are committed to ensuring the YOTCH platform is not used to harm, exploit, abuse, or endanger children.
This policy sets out our published standards in compliance with Google Play's Child Safety Standards policy. It covers how we define and prohibit CSAE, how users can report concerns, how we handle child sexual abuse material (CSAM), and how we comply with applicable child safety laws.
Zero tolerance. Any user found to be engaging in CSAE or distributing CSAM will be permanently banned from the platform and reported to the relevant authorities without exception.
Definitions
Child Sexual Abuse and Exploitation. Any content or behaviour that sexually exploits, abuses, or endangers a child. This includes grooming a child for sexual exploitation, sextortion, trafficking of a child for sex, or otherwise sexually exploiting a child in any way.
Child Sexual Abuse Material. Any visual depiction — including photographs, videos, and computer-generated imagery — involving the use of a minor engaging in sexually explicit conduct. CSAM is illegal and absolutely prohibited on YOTCH.
Child. Any person under the age of 18, or under the age defined as a minor in their jurisdiction, whichever provides the greater protection.
Grooming. Any behaviour by which a person builds trust with a child or their family with the intent to exploit, abuse, or harm that child. This includes establishing inappropriate relationships through the platform.
Platform context
YOTCH is a professional crew discovery platform for the superyacht industry, designed exclusively for working professionals in maritime employment. Users must be at least 16 years of age to create an account.
The platform connects captains and heads of department with crew members for professional employment purposes. It is not a social platform, dating platform, or consumer-facing community. All registered users are subject to identity and professional verification measures.
Age requirement. Users must be at least 16 years of age. Account creation requires a professional email address or vessel verification. We do not knowingly permit children to use the platform. If we identify a user under 16, their account is deleted immediately.
Notwithstanding the professional nature of the platform, YOTCH maintains full compliance with child safety standards. The absence of child users is not grounds for reduced vigilance — any indication of CSAE or CSAM is treated with the same severity regardless of context.
Standards against CSAE
YOTCH maintains and enforces the following standards against child sexual abuse and exploitation across all aspects of the platform.
No user may upload, share, transmit, or solicit any content that constitutes or facilitates CSAE. This applies to profile photos, in-app messages, shared documents, and any other content transmitted through the platform.
The platform may not be used to groom, solicit, or exploit any child. Any messaging behaviour that indicates grooming will result in immediate account suspension pending investigation.
The platform may not be used to facilitate the trafficking of any person, including any child, for sexual purposes. Any such activity will be reported to law enforcement immediately.
No user may upload, share, or transmit child sexual abuse material of any kind. CSAM is illegal in all jurisdictions in which YOTCH operates. Any CSAM identified on the platform will be reported to NCMEC and relevant authorities.
We enforce a minimum age of 16 for all users. Accounts suspected of being held by minors are suspended and investigated. Users found to have misrepresented their age are permanently banned.
YOTCH will cooperate fully with law enforcement investigations relating to CSAE or CSAM. We will preserve and provide relevant data in response to lawful requests and report known violations proactively.
Prohibited content & behaviour
The following are strictly prohibited on YOTCH and will result in immediate account termination and referral to relevant authorities:
- CSAM — any visual depiction of a minor in sexually explicit conduct, including AI-generated or computer-generated imagery
- Grooming — any attempt to establish inappropriate contact with a child, or to gain a child's trust, for the purpose of exploitation
- Sexual solicitation of minors — any request for sexual contact, images, or conversation from a person under 18
- Trafficking facilitation — any content or communication facilitating the trafficking of children for sexual purposes
- Sextortion — threatening to share intimate images or information about a minor to coerce or exploit them
- Age misrepresentation — creating an account or profile that misrepresents the user's age
- Sharing of prohibited material — forwarding, linking to, or otherwise distributing CSAM or CSAE-facilitating content
In-app reporting mechanism
YOTCH provides an in-app mechanism for users to report concerns about safety, inappropriate content, or violations of this policy. Reports can be submitted without leaving the app.
1. Navigate to Settings in the top right of the app. Available on both crew and captain accounts.
2. Select "Report a concern" from the menu. You can also report directly from any user profile by tapping the ⋯ menu and selecting "Report".
3. Choose "Child safety concern" from the category list for any report relating to CSAE or CSAM. Include as much detail as possible.
4. You will receive confirmation that your report has been received. Child safety reports are reviewed within 24 hours. Your identity is kept confidential.
You can also report child safety concerns directly to our safety team at support@yotch.app. All reports are treated as urgent and reviewed within 24 hours.
Anonymous reporting. You do not need to be a registered YOTCH user to report a concern. Anyone with information about potential CSAE or CSAM on the platform can email support@yotch.app directly.
Handling CSAM
YOTCH has clear procedures for identifying, removing, and reporting child sexual abuse material. These procedures are mandatory and non-discretionary — no employee or contractor has authority to waive them.
1. The account of the user associated with the CSAM is suspended immediately upon identification, pending investigation. No prior warning is given.
2. The CSAM is removed from the platform immediately. Removal takes priority over investigation — the content does not remain live while a review is conducted.
3. We report all known CSAM to the National Center for Missing & Exploited Children (NCMEC) CyberTipline as required under 18 U.S.C. § 2258A, and to the Internet Watch Foundation (IWF) for EU/UK-based material.
4. We refer the matter to the relevant law enforcement authority — UK police, Europol, or local authorities depending on jurisdiction. We cooperate fully with any resulting investigation.
5. Following investigation, confirmed violations result in permanent account termination. The user is blocked from creating new accounts.
No discretion. The above procedure is mandatory. Staff are not permitted to exercise discretion in failing to report known CSAM. Failure to follow these procedures is a disciplinary matter and may itself constitute a criminal offence.
Legal compliance
YOTCH complies with all applicable child safety laws across the jurisdictions in which it operates. Key legislation includes:
| Jurisdiction | Legislation | Our obligation |
|---|---|---|
| United Kingdom | Protection of Children Act 1978; Sexual Offences Act 2003; Online Safety Act 2023 | Prohibition of CSAM; reporting to IWF; age verification obligations |
| United States | PROTECT Act 2003; 18 U.S.C. § 2258A | Mandatory reporting of CSAM to NCMEC CyberTipline |
| European Union | Directive 2011/93/EU; Digital Services Act (DSA) | CSAM reporting; cooperation with national authorities |
| International | UNCRC; Budapest Convention on Cybercrime | Cooperation with international law enforcement requests |
Where laws conflict, YOTCH applies the standard that provides the greatest protection to children.
Safety by design
Beyond reactive enforcement, YOTCH builds safety measures into the platform architecture:
- Minimum age enforcement — users must be at least 16 to create an account; professional verification is required
- Three-tier captain verification — captains must verify via management company email, MMSI, or personal email domain before accessing crew profiles
- Captain-initiated contact only — crew members cannot be contacted unsolicited; all messaging is initiated by verified captains and HODs only
- Profile moderation — uploaded photos are reviewed against platform standards; automated and manual review processes are in place
- No anonymous accounts — all users provide an email address; account creation requires verification
- Reporting accessible from every profile — any profile can be reported in two taps without leaving the app
- Audit logging — all user actions are logged and retained for investigation purposes in the event of a safety concern
Child safety point of contact
YOTCH has a designated child safety point of contact who is responsible for CSAE and CSAM prevention practices and compliance with this policy. This individual is the primary contact for Google Play and law enforcement agencies on matters relating to child safety. Concerns can also be reported directly to the following external organisations:
- NCMEC CyberTipline — missingkids.org/cybertipline — report CSAM in the US
- Internet Watch Foundation (IWF) — iwf.org.uk/report — report CSAM in the UK and EU
- UK National Crime Agency (NCA) — ceop.police.uk — report child sexual exploitation to UK law enforcement
- Europol — europol.europa.eu — report to European law enforcement
Policy updates
This policy will be reviewed and updated at least annually, or sooner if required by changes in law, platform functionality, or Google Play policy requirements. The effective date at the top of this page reflects the most recent revision.
Material changes will be communicated to users via in-app notification. The current version of this policy is always available at yotch.app/child-safety.
For questions about this policy, contact support@yotch.app.