
1 Fivetran Job

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

Experience: 8.0 - 13.0 years
Salary: 15 - 30 Lacs
Location: Hyderabad | Remote

Source: Naukri

Position: Lead Data Engineer
Experience: 7+ Years
Location: Hyderabad | Chennai | Remote

Summary
We are seeking a Lead Data Engineer with 7+ years of experience to lead the development of ETL pipelines, data warehouse solutions, and analytics infrastructure. The ideal candidate will have strong experience in Snowflake, Azure Data Factory, dbt, and Fivetran, with a background in managing data for analytics and reporting, particularly within the healthcare domain.

Responsibilities
- Design and develop ETL pipelines using Fivetran, dbt, and Azure Data Factory for internal and client projects involving platforms such as Azure, Salesforce, and AWS.
- Monitor and manage production ETL workflows and resolve operational issues proactively.
- Document data lineage and maintain architecture artifacts for both existing and new systems.
- Collaborate with QA and UAT teams to produce clear, testable mapping and design documentation.
- Assess and recommend data integration tools and transformation approaches.
- Identify opportunities for process optimization and deduplication in data workflows.
- Implement data quality checks in collaboration with Data Quality Analysts (an illustrative sketch of such a check follows this posting).
- Contribute to the design and development of large-scale Data Warehouses, MDM solutions, Data Lakes, and Data Vaults.

Required Skills & Qualifications
- Bachelor's Degree in Computer Science, Software Engineering, Mathematics, or a related field.
- 6+ years of experience in data engineering, software development, or business analytics.
- 5+ years of strong hands-on SQL development experience.
- Proven expertise in Snowflake, Azure Data Factory (ADF), and ETL tools such as Informatica, Talend, dbt, or similar.
- Experience in the healthcare industry, with an understanding of PHI/PII requirements.
- Strong analytical and critical thinking skills.
- Excellent communication and interpersonal abilities.
- Proficiency in scripting or programming languages such as Python, Perl, Java, or Shell scripting in Linux/Unix environments.
- Familiarity with BI/reporting tools such as Power BI, Tableau, or Cognos.
- Experience with Big Data technologies such as Snowflake (Snowpark), Apache Spark, Hadoop, MapReduce, Sqoop, Hive, Pig, HBase, and Flume.
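To give a concrete flavour of the data-quality work mentioned in the responsibilities above, here is a minimal, purely illustrative Python sketch of null-rate and duplicate-key checks. It is not part of the job posting itself; the table name, column names, and the use of the standard-library sqlite3 module (as a stand-in for a warehouse connection such as Snowflake) are all hypothetical.

# Illustrative sketch only: basic data-quality checks (row count, duplicate
# business keys, null rates) of the kind a Lead Data Engineer might run.
# sqlite3 is used here as a stand-in for a real warehouse connection;
# the "patients" table and its columns are hypothetical examples.
import sqlite3

def run_quality_checks(conn, table, key, not_null_cols):
    """Return simple data-quality metrics for `table`."""
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    # Duplicate check on the business key.
    dupes = cur.execute(
        f"SELECT COUNT(*) - COUNT(DISTINCT {key}) FROM {table}"
    ).fetchone()[0]

    # Null-rate check for each required column.
    null_rates = {}
    for col in not_null_cols:
        nulls = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        null_rates[col] = nulls / total if total else 0.0

    return {"row_count": total, "duplicate_keys": dupes, "null_rates": null_rates}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE patients (patient_id TEXT, first_name TEXT, dob TEXT);"
        "INSERT INTO patients VALUES ('P001', 'Asha', '1990-01-01'),"
        " ('P001', 'Asha', '1990-01-01'), ('P002', NULL, '1985-06-12');"
    )
    print(run_quality_checks(conn, "patients", "patient_id", ["first_name", "dob"]))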

Posted 1 week ago

