Experience: 4 - 9 years

Salary: 5 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

What We Look for in You

  • A data-focused mindset with a passion for building reliable, scalable, and validated data pipelines
  • A strong understanding of both data engineering and data quality assurance practices
  • A meticulous eye for detail and a proactive attitude toward ensuring end-to-end data integrity
  • The ability to collaborate with diverse stakeholders across engineering, analytics, and business
  • A commitment to delivering clean, secure, and accurate data for decision-making

What You'll Be Doing

  • Design, develop, and maintain robust ETL/ELT pipelines to process large volumes of structured and unstructured data using Azure Data Factory, PySpark, and SQL-based tools
  • Collaborate with data architects and analysts to understand transformation requirements and implement business rules correctly
  • Develop and execute complex SQL queries to validate, transform, and performance-tune data workflows
  • Perform rigorous data validation including source-to-target mapping (S2T), data profiling, reconciliation, and transformation rule testing
  • Conduct unit, integration, regression, and performance testing for data pipelines and storage layers
  • Automate data quality checks using Python and frameworks like Great Expectations, DBT, or custom-built tools (a minimal sketch follows this list)
  • Monitor data pipeline health and implement observability through logging, alerting, and dashboards
  • Integrate testing into CI/CD workflows using tools like Azure DevOps, Jenkins, or GitHub Actions
  • Troubleshoot and resolve data quality issues, schema changes, and pipeline failures
  • Ensure compliance with data privacy, security, and governance policies
  • Maintain thorough documentation for data flows, test logic, and validation processes
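To give a flavour of the automation work described above, here is a minimal, hypothetical sketch of hand-rolled source-to-target reconciliation and data-quality checks in PySpark. It is not OptiSol's actual framework, and it deliberately avoids a specific library such as Great Expectations or DBT; the table, column names, and thresholds are assumptions made purely for illustration.

```python
# Hypothetical data-quality check sketch in PySpark; names and data are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# In a real pipeline these would come from the source extract and the transformed target;
# here they are tiny in-memory frames, seeded with deliberate issues so the checks fire.
source = spark.createDataFrame(
    [(1, "alice", 120.0), (2, "bob", 75.5), (3, "carol", None)],
    ["customer_id", "name", "balance"],
)
target = spark.createDataFrame(
    [(1, "alice", 120.0), (2, "bob", 75.5)],
    ["customer_id", "name", "balance"],
)

failures = []

# 1. Row-count reconciliation between source and target.
src_count, tgt_count = source.count(), target.count()
if src_count != tgt_count:
    failures.append(f"row count mismatch: source={src_count}, target={tgt_count}")

# 2. Null check on a business-critical column.
null_balances = source.filter(F.col("balance").isNull()).count()
if null_balances > 0:
    failures.append(f"{null_balances} source rows have NULL balance")

# 3. Key-level reconciliation: keys present in source but missing from target.
missing = source.select("customer_id").subtract(target.select("customer_id")).count()
if missing > 0:
    failures.append(f"{missing} customer_ids missing from target")

# Raise so a scheduler or CI job marks the run as failed and triggers alerting.
if failures:
    raise AssertionError("; ".join(failures))
print("all data-quality checks passed")
```

In practice checks like these would be wired into the pipeline's CI/CD stage or an orchestrator task so that a failing assertion blocks promotion of bad data.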

What You'll Bring to the Table

  • 4+ years of experience in Data Engineering and Data/ETL Testing
  • Strong expertise in writing and optimizing SQL queries (joins, subqueries, window functions, performance tuning); a brief example follows this list
  • Proficiency in Python or PySpark for data transformation and automation
  • Hands-on experience with ETL tools such as Azure Data Factory, Talend, SSIS, or Informatica
  • Familiarity with cloud platforms, preferably Azure; AWS or GCP is a plus
  • Experience working with data lakes, data warehouses (Snowflake, BigQuery, Redshift), and modern data platforms
  • Knowledge of version control systems (Git), issue tracking tools (JIRA), and Agile methodologies
  • Exposure to data testing frameworks like Great Expectations, DBT tests, or custom validation tools
  • Experience integrating data testing into CI/CD pipelines
  • Nice to have: Familiarity with Airflow, Databricks, BI tools (Power BI, Tableau), and metadata management practices
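As a rough illustration of the SQL window-function skill listed above, the sketch below runs a ROW_NUMBER() query through Spark SQL to keep the latest record per key. The table and columns are invented for the example and are not part of the role description.

```python
# Hypothetical window-function example via Spark SQL; table and columns are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window-demo").getOrCreate()

orders = spark.createDataFrame(
    [(101, "2024-01-05", 250.0), (101, "2024-02-10", 300.0), (102, "2024-01-20", 99.0)],
    ["customer_id", "order_date", "amount"],
)
orders.createOrReplaceTempView("orders")

# Keep only the most recent order per customer; on large tables this pattern is
# usually combined with partition pruning and predicate pushdown for performance.
latest = spark.sql("""
    SELECT customer_id, order_date, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn
        FROM orders
    ) AS ranked
    WHERE rn = 1
""")
latest.show()
```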

Core benefits you'll gain

  • Competitive salary aligned with industry standards
  • Hands-on experience with enterprise-scale data platforms and cloud-native tools
  • Opportunities to work on data-centric initiatives across AI, analytics, and enterprise transformation
  • Access to internal learning accelerators, mentorship, and career growth programs
  • Flexible work culture, wellness initiatives, and comprehensive health benefits

Optisol Business Solutions

Information Technology and Services

Austin
