Job Summary
The Content Moderator (Trust & Safety Specialist) is responsible for reviewing and monitoring user-generated content to ensure compliance with platform policies, legal standards, and community guidelines. This role is essential to maintaining a safe, respectful, and engaging online environment by identifying and removing content that violates established rules.
Minimum Qualifications
- Education: Degree required
- Experience Level: Mid-Level
- Minimum Experience: 4 years
Key Responsibilities
- Review and take action on reported content, including text, images, and videos, ensuring alignment with platform policies. Focus on high-priority queues and complex edge cases requiring human judgment.
- Monitor daily moderation queues to detect emerging abuse patterns, such as new spam tactics or coordinated harassment campaigns, and promptly escalate findings to the Policy team.
- Provide feedback on moderation tool performance and recommend workflow improvements to increase efficiency without compromising accuracy.
- Maintain a minimum 95% accuracy rate on moderation decisions. Participate in calibration sessions to ensure consistent policy enforcement.
- Offer constructive input to Policy teams when guidelines lack clarity or conflict with real-world context, contributing to improved moderation frameworks.
- Investigate content removals and account suspensions, and make final decisions on reinstatement requests with fairness and due process.
- Act as a designated responder during critical “red alert” situations, including graphic live-streamed incidents or coordinated abuse campaigns.
Requirements
- At least 4 years of experience in Content Moderation, Trust & Safety Operations, or Community Management within a major technology or social media platform.
- Strong ability to detect subtle policy violations that automated systems may overlook, such as coded hate symbols embedded in images.
- Proficiency with moderation tools (e.g., Hive, Besedo, Salesforce) and Google Workspace.
- Experience managing high content volumes during global events or viral trends.
- High emotional resilience and the ability to review sensitive content, including violence, hate speech, and adult material, while maintaining personal digital wellness strategies.
- Deep understanding of regional and cultural nuances, historical contexts, and legal considerations. Ability to differentiate between hate speech and protected political expression, or extremist propaganda and legitimate documentary or news content.
- Demonstrated ability to maintain quality metrics while processing high volumes of content (80–100+ items per hour).
- Strong focus and attention to detail during repetitive tasks.
Remuneration
NGN 500,000 per month.