- Analyze, categorize, and either approve or remove user-generated content based on established safety protocols and client standards, using dedicated internal software.
- Proactively stay current with all updates, modifications, and nuances in client content policies and moderation guidelines.
- Investigate and resolve complex or challenging content violations, and escalate high-stakes issues to senior members of the Trust and Safety team.
- Act as a dedicated advocate for the platform's user community, ensuring their safety and positive experience.
- Contribute ideas and actively participate in projects aimed at refining operational workflows, improving both the speed and quality of moderation tasks.
- Attend required training sessions and engage in workgroup discussions to foster professional growth and ensure optimal performance in the role.
- Actively participate in discussions involving sensitive social topics to ensure the platform maintains a secure and non-toxic environment for all users.

Germany
United Kingdom
Netherlands
Ireland
Norway
Switzerland