AI Data Quality & Engineering Lead

Experience: 3 - 8 years

Salary: 5 - 8 Lacs

Posted: 4 hours ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Why this role exists
As AI systems scale rapidly across industries, the integrity and accuracy of testing, training, and evaluation data have never been more critical. TaskUs needs a proactive leader who can architect and uphold high-quality annotation workflows so that AI models are built and evaluated on reliable data, without compromising speed or efficiency.

The impact you'll make:

  • Build and guide a high-performing team: Lead and mentor a team of Data Quality Analysts, setting clear quality goals, delivering feedback, and fostering a culture of precision and accountability.
  • Ensure quality at scale: Develop and continually refine robust QA processes, SOPs, and statistical quality metrics (e.g., F1 score, inter-annotator agreement) to protect the integrity of annotation outputs.
  • Drive transparency and insight: Create dashboards and reports that reveal quality trends, root causes of errors, and improvement opportunities - communicating these insights to leadership and clients.
  • Champion tool innovation and efficiency: Manage annotation and QA platforms (like Labelbox, Dataloop, LabelStudio), and lead the evaluation or implementation of new automation tools to elevate efficiency and maintain quality.

Responsibilities:
Strategic Leadership
  • Drive the development, refinement, and documentation of quality assurance processes and standard operating procedures to ensure high-quality outputs.
  • Establish comprehensive quality metrics (e.g., F1 score, inter-annotator agreement) that align with business objectives and industry standards.
  • Continuously review and refine annotation workflows to proactively identify risks and areas to increase efficiency and reduce errors.
  • Act as the subject matter expert on annotation quality, providing ongoing feedback, training, and support to annotators and project teams to uphold the highest quality standards.

Analysis & Reporting
  • Lead in-depth data analysis to diagnose quality issues, assess the effectiveness of quality strategies, and uncover root causes of recurring errors.
  • Develop and maintain dashboards that provide real-time insights into quality metrics and project performance.
  • Prepare and deliver strategic quality reports to senior management and clients, articulating quality trends, risks, and improvement plans.
  • Partner with cross-functional teams, including operational management, engineering, and client services, to align on project goals and quality assurance initiatives.

Operational Leadership
  • Lead a team of Data Quality Analysts and provide mentorship, training, and expertise, fostering a culture of continuous improvement and accountability.
  • Manage the configuration and integration of annotation and quality control tools (e.g., Labelbox, Dataloop, LabelStudio), ensuring optimal tool performance and alignment with project requirements.
  • Identify, evaluate, and implement innovative quality control tools and automation technologies to streamline quality control workflows, enhance analytical capabilities, and improve operational efficiency.

Required Qualifications
  • Bachelor's degree in a technical field (e.g., Computer Science, Data Science) or equivalent professional experience.
  • 3+ years of experience in data quality management, data operations, or related roles within AI/ML or data annotation environments.
  • Proven track record in designing and executing quality assurance strategies for large-scale, multi-modal data annotation projects.
  • Proven track record in a leadership role managing and developing high-performing, remote or distributed teams.
  • Deep understanding of data annotation processes, quality assurance methodologies, and statistical quality metrics (e.g., F1 score, inter-annotator agreement).
  • Strong data-analysis skills, with the ability to interrogate large datasets, perform statistical analyses, and translate findings into actionable recommendations.
  • Excellent communication skills, with experience presenting complex data and quality insights to technical and non-technical stakeholders.
  • Proficiency with annotation and QA tools (e.g., Labelbox, Dataloop, LabelStudio).
  • High level of proficiency in common data-analysis tools, such as Excel and Google Sheets.
  • Familiarity with programmatic data analysis techniques (e.g., Python, SQL); a short illustrative sketch of this kind of quality-metric computation follows this list.
  • Familiarity with the core concepts of AI/ML pipelines, including data preparation, model training, and evaluation.

Preferred Qualifications
  • Prior experience in an agile or fast-paced tech environment with exposure to AI/ML pipelines.
  • Experience in a managed services or vendor-driven environment.
  • Familiarity with prompt engineering and large-language-model assisted workflows to optimise annotation and validation processes.
  • In-depth knowledge of ethical AI practices and compliance frameworks.
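As context for the metrics named above, here is a minimal sketch of how F1 score and inter-annotator agreement (measured here as Cohen's kappa) might be computed for a batch of annotations, assuming gold labels and two annotators' labels are available as Python lists and scikit-learn is installed; the example data is hypothetical.

    # Minimal sketch: computing annotation quality metrics with scikit-learn.
    # The label lists below are hypothetical; in practice they would come from
    # an annotation platform export (e.g., Labelbox or LabelStudio).
    from sklearn.metrics import f1_score, cohen_kappa_score

    # Gold (reference) labels and two annotators' labels for the same items.
    gold_labels = ["cat", "dog", "dog", "cat", "bird", "dog"]
    annotator_a = ["cat", "dog", "cat", "cat", "bird", "dog"]
    annotator_b = ["cat", "cat", "dog", "cat", "bird", "dog"]

    # F1 score of annotator A against the gold labels, macro-averaged across classes.
    f1 = f1_score(gold_labels, annotator_a, average="macro")

    # Inter-annotator agreement between annotators A and B as Cohen's kappa.
    kappa = cohen_kappa_score(annotator_a, annotator_b)

    print(f"Annotator A macro F1 vs. gold: {f1:.2f}")
    print(f"Inter-annotator agreement (Cohen's kappa): {kappa:.2f}")

In a production QA workflow these per-batch numbers would typically feed the dashboards and quality reports described in the responsibilities above.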

TaskUs

Outsourcing and Offshoring Consulting

New Braunfels, Texas
