Job Description
Key Responsibilities:
• Conduct audits and quality checks on content moderation tasks performed by moderators.
• Evaluate the accuracy, consistency, and compliance of moderation decisions against platform/community guidelines.
• Identify process gaps, errors, and improvement opportunities in moderation workflows.
• Provide constructive feedback and coaching to moderation teams to enhance quality and performance.
• Collaborate with training, operations, and policy teams to keep moderation knowledge and practices up to date.
• Track and report key quality metrics (e.g., QA scores, error trends) on a regular basis.
• Participate in calibration sessions to align understanding and application of guidelines across teams.
• Support policy updates and help implement changes within the moderation process.
Requirements:
• Bachelor’s degree in any field.
• 2+ years of experience in content moderation, trust & safety, or a related QA role.
• Strong understanding of online safety, content standards, and moderation challenges.
• Excellent analytical, decision-making, and problem-solving skills.
• Strong written and verbal communication skills.
• Experience with QA tools and reporting systems is a plus.
• Ability to work independently and handle sensitive content with professionalism.
Preferred Qualifications:
• Experience in social media or user-generated content platforms.
• Familiarity with international content moderation standards.
• Multilingual proficiency.