
2 Airflow Jobs

JobPe aggregates listings for easy access, but you apply directly on the job portal.

6.0 - 9.0 years

18 - 25 Lacs

Chennai

Work from Office

Key Skills: Python, SQL, PySpark, Databricks, AWS, Data Pipeline, Data Governance, Data Security, Leadership, Cloud Platforms, Life Sciences, Migration, Airflow

Roles & Responsibilities:
- Lead a team of data engineers and developers, defining technical strategy, best practices, and architecture for data platforms.
- Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks.
- Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets.
- Enforce data validation, lineage, and quality checks across the data lifecycle, defining standards for metadata, cataloging, and governance.
- Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations.
- Implement performance-tuning strategies, cost-optimization best practices, and efficient cluster configurations on AWS/Databricks.
- Define and enforce data security standards and IAM policies, and ensure compliance with industry-specific regulatory frameworks.
- Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions.
- Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime.
- Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team.

Experience Requirements:
- 6-9 years of hands-on experience in data engineering, with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong leadership experience in data engineering or data architecture roles, with a proven track record of leading teams and delivering large-scale data solutions.
- Expertise in designing and developing data pipelines, optimizing performance, and ensuring data quality.
- Solid experience with cloud platforms (AWS, Databricks), data governance, and security best practices.
- Experience with data migration strategies and leading transitions from on-premises to cloud-based environments.
- Experience in the Life Sciences or Pharma domain is highly preferred, with a deep understanding of industry-specific data requirements.
- Strong communication and interpersonal skills, with the ability to collaborate across teams and engage stakeholders.

Education: Any Graduation.
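The data-validation and quality-check duties listed above follow a common ingest-validate-transform pattern. A minimal, library-free sketch of that pattern is shown below; all function and field names are illustrative assumptions (a production pipeline for this role would use PySpark/Databricks, not plain Python lists).

```python
# Minimal sketch of an ingest -> validate -> transform pipeline step,
# illustrating the data-quality checks the responsibilities describe.
# Names and rules are illustrative, not from the job posting.

def ingest(rows):
    """Simulate pulling raw records from a source system."""
    return list(rows)

def validate(rows, required=("id", "value")):
    """Basic quality rule: required fields present and non-null.
    Returns (good, quarantined) so bad records can be inspected."""
    good, bad = [], []
    for row in rows:
        if all(k in row and row[k] is not None for k in required):
            good.append(row)
        else:
            bad.append(row)
    return good, bad

def transform(rows):
    """Example transformation: normalise 'value' to float."""
    return [{**r, "value": float(r["value"])} for r in rows]

raw = [{"id": 1, "value": "3.5"}, {"id": None, "value": "2"}, {"id": 2}]
good, bad = validate(ingest(raw))
clean = transform(good)
print(len(clean), len(bad))  # 1 valid record, 2 quarantined
```

The same shape scales up: quarantining failed records instead of dropping them silently is what makes downstream lineage and quality reporting possible.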

Posted 2 weeks ago

7.0 - 12.0 years

0 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role: Data Engineer
Experience: 8+ Years
Location: Hyderabad, Bengaluru, Chennai, Pune, Ahmedabad and Noida
Notice Period: 30 Days

What You'll Do:
- Provide the organization's data consumers with high-quality data sets by curating, consolidating, and manipulating data from a wide variety of large-scale (terabyte and growing) sources.
- Build first-class data products and ETL processes that interact with terabytes of data on leading platforms such as Snowflake and BigQuery.
- Partner with our Analytics, Product, CRM, and Marketing teams.
- Own the data pipelines' SLAs and dependency management.
- Write technical documentation for data solutions and present at design reviews.
- Resolve data pipeline failure events and implement anomaly detection.
- Work with teams ranging from Data Science and Product Owners to Marketing and Software Engineers on data solutions and technical challenges.
- Mentor junior members of the team.

What We Seek:

Education and Work Experience:
- Bachelor's degree in Computer Science or a related field.
- 8+ years of experience in commercial data engineering or software development.

Technical Experience:
- Experience with Big Data technologies such as Snowflake, Databricks, and PySpark.
- Expert-level skills in writing and optimizing complex SQL; advanced data exploration skills with a proven record of querying and analyzing large datasets.
- Solid experience developing complex ETL processes from concept through implementation, deployment, and operations, including SLA definition, performance measurement, and monitoring.
- Hands-on knowledge of the modern AWS data ecosystem, including AWS S3.
- Experience with relational databases such as Postgres, and with programming languages such as Python and/or Java.
- Knowledge of cloud data warehouse concepts.
- Experience building and operating data pipelines and products in line with the data mesh philosophy would be beneficial.
- Demonstrated proficiency in data handling, including data lineage, data quality, data observability, and data discoverability.

Communication/People Skills:
- Excellent verbal and written communication skills; able to convey key insights from complex analyses in summarized business terms to non-technical stakeholders and to communicate effectively with other technical teams.
- Strong interpersonal skills and the ability to work in a fast-paced, dynamic environment.
- Ability to make progress on projects independently and enthusiasm for solving difficult problems.
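The anomaly detection mentioned in this posting is often applied to pipeline run metrics, e.g. flagging runs whose duration deviates sharply from the historical mean. A simple z-score sketch is shown below; the threshold and sample data are assumptions, not from the posting.

```python
# Illustrative z-score anomaly detection on pipeline run durations.
# The threshold (z > 2 or 3) and the sample data are assumptions.
import statistics

def flag_anomalies(durations, z_threshold=3.0):
    """Return indices of runs whose absolute z-score exceeds the threshold."""
    mean = statistics.mean(durations)
    stdev = statistics.pstdev(durations)  # population std dev of the window
    if stdev == 0:
        return []  # all runs identical -> nothing to flag
    return [i for i, d in enumerate(durations)
            if abs(d - mean) / stdev > z_threshold]

runs = [61, 59, 63, 60, 58, 62, 240]  # seconds; last run is an outlier
print(flag_anomalies(runs, z_threshold=2.0))  # the 240 s run (index 6) is flagged
```

In practice this kind of check would run over a rolling window of recent runs and feed an alerting system, rather than a static list.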

Posted 1 month ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies