Trust & Safety Analyst

3 - 5 years

3 - 6 Lacs

Posted: 5 hours ago | Platform: Naukri

Apply

Work Mode

Work from Office

Job Type

Full Time

Job Description


About The Role
Skill required: User-Generated Content Moderation - Content Moderation
Designation: Trust & Safety Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years
What would you do?
Content moderation is meaningful work that helps keep the internet safe. It can also be challenging at times. In this role, individuals may be directly or inadvertently exposed to potentially objectionable and sensitive content (e.g., graphic, violent, sexual, or egregious material). Content moderators therefore need strong resilience and coping skills. We care for the health and well-being of our people and provide the support and resources needed to perform the role's responsibilities. Active participation in Accenture's well-being support program, designed specifically for the Trust & Safety community, builds valuable skills that promote individual and collective well-being.

Responsibilities include:
• Identify spam content to keep search results genuine.
• Help victims remove their explicit videos/photographs from global sites.
• Remove personally identifiable information reported by users from search results.
• Address ad blocking by improving ad experiences across the web.
• Review photos, videos, and text-based content, and judge whether the reviewed content violates the client's terms of service. The content may be sensitive in nature.
• Ensure every piece of content that violates the client's terms of service is accurately identified and flagged for action in a timely manner.

In addition, you will:
1. Review videos across all workflows for policy violations to ensure consistent enforcement.
2. Develop a deep understanding of policies and guidelines, and how to interpret them, in order to enforce them in both standard and non-standard situations.
3. Comprehend policy and community guidelines to make informed decisions that balance user safety and platform integrity.

What are we looking for?
We are seeking highly analytical and detail-oriented individuals to join a specialized team focused on misinformation detection and GenAI-manipulated media assessment. This cross-functional role plays a vital part in maintaining platform integrity, reducing user exposure to harmful or deceptive content, and supporting compliance with evolving global standards. As an Altered and Synthetic Content Assessment Analyst, you will assess whether content, particularly video, has been synthetically altered using Generative AI or contains misleading narratives. Your structured evaluations will feed into the final decision-making processes led by the client's full-time content policy teams.

Minimum Skills:
• 2+ years of experience in content moderation, investigative journalism, or media analysis preferred.
• Familiarity with social listening tools, content evaluation platforms, or basic NLP/data analysis tools a plus.
• Exceptional verbal and written skills, with the ability to collaborate effectively across global teams and present to senior stakeholders.
• Strong written analysis and documentation skills; the ability to clearly summarize complex narratives and technical findings.
• Excellent critical thinking, attention to detail, and the ability to apply nuanced policy in ambiguous situations.

Good-to-Have Skills:
• Domain Knowledge: Strong understanding of content moderation.
• Creator Economy Knowledge: Experience with influencer marketing, creator partnerships, or branded content platforms.
• Problem-Solving Mindset: Track record of identifying process inefficiencies and implementing scalable solutions.
Roles and Responsibilities:
• Analyze emerging and existing content for misleading or deceptive elements in alignment with misinformation and manipulated media policies.
• Conduct contextual verification by evaluating people, events, and circumstances depicted in the content, using internal tools and limited open-source checks.
• Identify high-risk narratives (e.g., health misinformation, election manipulation, deepfakes) based on severity, spread potential, and platform impact.
• Identify altered or fabricated video content and highlight such content.
• Use structured internal tools and workflows to submit findings; the role is advisory, not enforcement-based.
• Conduct plausibility reviews to help determine whether individuals, speech, or events in a video are potentially altered or misrepresented.
• Submit detailed, policy-aligned assessments to the client for review, with clear rationale and supporting context.
• Track patterns and trends across flagged content to support broader strategy and detection tooling.
• Maintain consistent quality and adhere to established benchmarks for accuracy, neutrality, and timeliness.
Qualification: Any Graduation
    Accenture

    Professional Services

    Dublin
