SAFETY & MODERATION POLICY

Last updated: October 28, 2025

Goals

Prevent harm, protect users, and maintain trust across the platform.

Moderation Tools & Processes

  • Automated detection: profanity filters plus flags for sexual content and hate speech (a minimal sketch follows this list).
  • Human review: a Safety Team reviews flagged items and incidents.
  • Escalate button: an in-call emergency disconnect that also alerts the Safety inbox.
  • KYC & verification: Companions complete KYC (Know Your Customer) checks; repeatedly flagged users may be asked to verify their identity.
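
For illustration only, here is a minimal sketch of the first automated pass, assuming a hypothetical rule set: the categories, regex patterns, and names (RULES, Flag, scan_message) are placeholders, not MURM's actual filters.

    import re
    from dataclasses import dataclass

    # Hypothetical category -> pattern map. Real detection is far richer
    # than toy regexes (classifiers, context, locale).
    RULES = {
        "profanity": re.compile(r"\b(darn|heck)\b", re.IGNORECASE),
        "hate_speech": re.compile(r"\b(placeholder_slur)\b", re.IGNORECASE),
    }

    @dataclass
    class Flag:
        category: str  # which rule matched
        excerpt: str   # matched text, forwarded to human review

    def scan_message(text: str) -> list[Flag]:
        """Return one Flag per matching category; any hit is queued for the Safety Team."""
        flags = []
        for category, pattern in RULES.items():
            match = pattern.search(text)
            if match:
                flags.append(Flag(category, match.group(0)))
        return flags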

Incident Handling

  • Flag/Escalate: immediate disconnect if required.
  • Triage: the Safety Team triages every flag within 15 minutes (see the sketch after this list).
  • Action: warning, temporary suspension, permanent ban, refund, or referral to authorities.
  • Communicate: affected parties are notified and given appeal details.
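
A minimal sketch of how this lifecycle could be modeled; all names here (Incident, Action, TRIAGE_DEADLINE) are hypothetical, and the 15-minute constant simply encodes the triage commitment above.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta, timezone
    from enum import Enum

    TRIAGE_DEADLINE = timedelta(minutes=15)  # policy: triage within 15 minutes

    class Action(Enum):
        WARNING = "warning"
        TEMP_SUSPENSION = "temporary suspension"
        PERMANENT_BAN = "permanent ban"
        REFUND = "refund"
        REFERRAL = "referral to authorities"

    @dataclass
    class Incident:
        flagged_at: datetime
        disconnected: bool = False      # immediate disconnect if required
        actions: list[Action] = field(default_factory=list)
        parties_notified: bool = False  # notification includes appeal details

        def triage_due(self) -> datetime:
            return self.flagged_at + TRIAGE_DEADLINE

    # Usage: an escalated call that was disconnected on the spot.
    incident = Incident(flagged_at=datetime.now(timezone.utc), disconnected=True)
    assert incident.triage_due() - incident.flagged_at == TRIAGE_DEADLINE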

Emergency & Self-harm

MURM is not an emergency service. If a participant expresses imminent self-harm or harm to others, MURM will attempt to provide local emergency resources and may, where legally permitted, notify relevant authorities.

Transparency

We maintain an incident log and periodically publish anonymized safety metrics (number of escalations, average response time, actions taken).
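
To illustrate the aggregation, the sketch below computes those three metrics from a list of incident records. The log shape and field names are assumptions, not MURM's actual schema; the point is that only aggregate counts leave the function.

    from collections import Counter

    def safety_metrics(incidents: list[dict]) -> dict:
        """Aggregate escalation counts; no user identifiers appear in the output."""
        minutes = [i["response_minutes"] for i in incidents]
        return {
            "escalations": len(incidents),
            "avg_response_minutes": round(sum(minutes) / len(minutes), 1) if minutes else 0.0,
            "actions_taken": dict(Counter(i["action"] for i in incidents)),
        }

    # Example: two logged incidents yield the anonymized aggregate.
    log = [
        {"response_minutes": 12, "action": "warning"},
        {"response_minutes": 8, "action": "temporary suspension"},
    ]
    print(safety_metrics(log))
    # {'escalations': 2, 'avg_response_minutes': 10.0,
    #  'actions_taken': {'warning': 1, 'temporary suspension': 1}}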