The Merchant must support a complaint process that allows for the reporting of content that may be illegal or otherwise violates the Standards and must review and resolve all reported complaints within seven (7) business days. In the event that such review yields evidence of illegal content, the Merchant must remove that content immediately.
A dedicated priority content moderation workflow ensures swift action on flagged content: anything potentially illegal or harmful is reviewed and acted upon within seven business days.
How it works
Complaints and reported content are handled within seven business days via a priority complaint handling flow, maintaining your platform's compliance along with its health, safety, and integrity.
In all cases, the complainant must state which violation has occurred. Depending on the nature of the violation, they may also need to verify their identity with a government-issued identity document.
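As an illustration only, the Python sketch below shows one way an intake system might record a complaint and compute the seven-business-day resolution deadline the Standards require. The `Complaint` fields and `add_business_days` helper are hypothetical, not part of any specific product API.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

BUSINESS_DAYS_SLA = 7  # resolution window required by the Standards


def add_business_days(start: date, days: int) -> date:
    """Advance a date by the given number of business days (Mon-Fri)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 are Monday to Friday
            days -= 1
    return current


@dataclass
class Complaint:
    content_id: str
    stated_violation: str            # required in all cases
    reported_on: date
    identity_verified: bool = False  # required only for certain violation types
    due_by: date = field(init=False)

    def __post_init__(self) -> None:
        self.due_by = add_business_days(self.reported_on, BUSINESS_DAYS_SLA)


# Example: a complaint filed on a Friday is due seven business days later.
complaint = Complaint("content-123", "illegal content", date(2024, 5, 3))
print(complaint.due_by)  # 2024-05-14
```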
Feature focus
Combining artificial intelligence with human judgement streamlines your complaint resolution workflow, enabling swift and accurate decisions and creating a platform your users can trust.
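As a rough sketch of how such a hybrid pipeline is often wired, the routing below actions high-confidence classifier scores automatically and escalates ambiguous scores to a human moderator. The thresholds and names are illustrative assumptions, not a description of any particular product's internals.

```python
from enum import Enum


class Decision(Enum):
    REMOVE = "remove"
    PUBLISH = "publish"
    HUMAN_REVIEW = "human_review"


# Illustrative thresholds; in practice these are tuned per violation type.
AUTO_REMOVE_THRESHOLD = 0.95
AUTO_PUBLISH_THRESHOLD = 0.05


def route(violation_score: float) -> Decision:
    """Route content based on an AI classifier's violation score (0.0-1.0).

    High-confidence scores are actioned automatically; ambiguous
    scores are escalated to a human moderator for judgement.
    """
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return Decision.REMOVE
    if violation_score <= AUTO_PUBLISH_THRESHOLD:
        return Decision.PUBLISH
    return Decision.HUMAN_REVIEW
```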
Maintain platform health with a robust, multi-pronged content moderation tech stack that ensures published content has been diligently screened for violations.
Foster a safe experience for all users with a participant consent and verification process, combined with content moderation quality control and complaint resolution.
By manually reviewing a random sample of all moderated content, we can ensure the correct moderation decisions are being made and the health of your content is maintained.
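One common way to implement this kind of quality control is a random draw over recent moderation decisions for manual re-review. The sketch below is a minimal illustration; the function name and the 5% sample rate are assumptions, and real programmes tune the rate to moderation volume and observed error rates.

```python
import random


def sample_for_audit(decision_ids: list[str], rate: float = 0.05,
                     seed: int | None = None) -> list[str]:
    """Select a random sample of moderation decisions for manual re-review."""
    rng = random.Random(seed)  # seeding makes the audit sample reproducible
    sample_size = max(1, round(len(decision_ids) * rate))
    return rng.sample(decision_ids, k=min(sample_size, len(decision_ids)))
```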
Explore the wide range of safeguarding and compliance features that work together to protect your business and its users.
Any questions? Let's chat. Our dedicated team are always on hand to discuss identity and content verification.