User Safety & Content Moderation Policy

At Lakku.ai, we are committed to protecting minors from child sexual abuse and exploitation (CSAE), online harm, and inappropriate content.
We implement strict moderation policies, AI-powered detection tools, and user safety measures to ensure a secure and respectful environment.

This policy outlines our approach to preventing CSAE, moderating harmful content, and providing users with reporting tools.

CSAE Prevention Measures & AI-Based Moderation
AI-Powered Content Moderation

To detect and prevent CSAE-related content, we use AI-powered detection and filtering tools combined with human review.

We enforce a strict zero-tolerance policy toward CSAE content: any such content is removed as soon as it is detected.

Our AI models are continuously improved to provide better content filtering and user protection.

Age Restrictions & Protection for Minors
We enforce age requirements and apply additional protective measures for minors across the platform.
Reporting CSAE Content, Harassment & Abuse

We provide multiple ways for users to report child safety concerns:

In-App Reporting (Coming soon)

Users will soon be able to flag harmful, abusive, or inappropriate content directly within the app.

Email Support

Users can report any child safety concerns via email: safety@lakku.ai

Legal & Law Enforcement Reports

If required, we cooperate with authorities and child protection organizations to address CSAE violations.

For law enforcement inquiries, please email legal@lakku.ai.

CSAE-related reports are reviewed as a priority and escalated immediately if needed.

Content & Behavior Strictly Prohibited

We strictly prohibit CSAE content, harassment, abuse, and any content that sexualizes or endangers minors.

Users violating these rules will be reported to law enforcement and permanently removed from the platform.

Contact Information for CSAE & Safety Issues

For safety-related concerns, please contact us:
Email: safety@lakku.ai

We are dedicated to protecting minors and ensuring a safe, abuse-free digital environment.