These Child Safety Standards apply to the Lavenday application and related services (the "Application"), operated by IceCurve (the "Service Provider"). They set out our policies, procedures, and commitments to combat child sexual abuse and exploitation (CSAE) and child sexual abuse material (CSAM) on our platform, in line with Google Play's Child Safety Standards Policy and widely recognised industry best practices such as those published by the Tech Coalition.
1. Scope & Definitions
CSAE (Child Sexual Abuse and Exploitation) refers to any content or behaviour that sexually exploits, abuses, or endangers a child, including but not limited to grooming, sextortion, trafficking, and other forms of sexual exploitation of a minor.
CSAM (Child Sexual Abuse Material) refers to any visual depiction, including photos, videos, drawings, and computer-generated imagery, that involves a minor engaging in sexually explicit conduct. CSAM is illegal and strictly prohibited on Lavenday.
A minor is any individual under the age of 18, or under the age of majority in the user's jurisdiction, whichever is higher.
2. Age Requirement
Lavenday is currently available only to users aged 18 and over. Users determined to be minors are restricted from downloading or using the Application on Google Play.
If and when Lavenday lowers its minimum age to 13, we will implement additional safeguards appropriate to younger users, including but not limited to age-appropriate default privacy settings, limits on direct contact from unknown adults, and (where required) verifiable parental consent in line with COPPA, the GDPR, the UK Age Appropriate Design Code, and other applicable laws.
3. Prohibited Conduct
The following are strictly prohibited on Lavenday. Users who engage in any of them will have their accounts permanently removed and, where applicable, will be reported to law enforcement:
- Sharing, soliciting, distributing, or linking to CSAM in any form.
- Grooming: any attempt to build an emotional or romantic connection with a minor for the purpose of sexual exploitation or abuse.
- Sextortion: coercing or threatening a minor to provide sexual content, money, or any other benefit.
- Trafficking, smuggling, or advertising the sexual services of a minor.
- Sexualised comments, role-play, or imagery directed at or involving minors.
- Attempting to contact, meet, or arrange contact with a minor for any sexual purpose.
- Sharing personally identifying information (PII) of a minor without a clear lawful basis.
4. In-App Reporting Mechanism
Every post, comment, profile, and message on Lavenday includes an in-app Report option. Users can flag content or accounts they believe violate our child-safety policies without leaving the application. Reports related to child safety are routed to our moderation team with the highest priority.
Users may also contact us directly at support@lavenday.com for any report, question, or concern related to child safety on Lavenday.
5. How We Handle Reports & CSAM
When we receive a report or otherwise obtain actual knowledge of suspected CSAE or CSAM on the Application, we take the following actions in accordance with our published standards and applicable law:
- Remove the content from the Application as quickly as reasonably possible.
- Preserve the content, associated metadata, and account information as required for lawful reporting and for use by investigating authorities.
- Report confirmed CSAM to the National Center for Missing & Exploited Children (NCMEC) CyberTipline and/or to the relevant regional or national authorities, as required by applicable law.
- Suspend or permanently ban the account(s) responsible for the content, and take reasonable steps to prevent the same individual from re-registering.
- Cooperate with law enforcement and child safety organisations in lawful investigations.
Our standards are informed by the Tech Coalition's best practices for combating online child sexual exploitation and abuse.
6. Legal Compliance
Lavenday complies with all applicable child-safety laws in the jurisdictions in which it operates, including (where applicable) the U.S. laws requiring reporting of apparent CSAM to NCMEC, the EU Digital Services Act, the UK Online Safety Act, and equivalent regional legislation. We report CSAM and related offences to the appropriate regional and national authorities as required by law.
7. Child Safety Point of Contact
We maintain a designated point of contact who is ready and able to speak to our CSAM prevention practices and compliance with Google Play's Child Safety Standards Policy.
Role: Child Safety Contact, Lavenday (IceCurve)
Email: support@lavenday.com
8. Changes to These Standards
We may update these Child Safety Standards from time to time to reflect changes in our practices, applicable law, or industry guidance. The latest version will always be available at this URL. Material changes will be reflected in the effective date below.
These Child Safety Standards are effective as of 2026-04-20.
9. Contact Us
For any questions, feedback, or concerns about these standards, including matters specifically concerning child safety, please contact us at support@lavenday.com.