5.0 - 10.0 years
10 - 18 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
About the Role: We are seeking a passionate and experienced Subject Matter Expert and Trainer to deliver our comprehensive Data Engineering with AWS program. This role combines deep technical expertise with the ability to coach, mentor, and empower learners to build strong capabilities in data engineering, cloud services, and modern analytics tools. If you have a strong background in data engineering and love to teach, this is your opportunity to create impact by shaping the next generation of cloud data professionals. Key Responsibilities: Deliver end-to-end training on the Data Engineering with AWS curriculum, including: - Oracle SQL and ANSI SQL - Data Warehousing Concepts, ETL & ELT - Da...
Posted 2 months ago
6.0 - 11.0 years
0 - 2 Lacs
Chennai
Work from Office
Requirement 1: Skills: AWS Redshift dev with Apache Airflow. Location: Chennai. Experience: 8+ Years. Work Mode: Hybrid. Role & responsibilities: Senior Data Engineer - AWS Redshift & Apache Airflow. Location: Chennai. Experience Required: 8+ Years. Job Summary: We are seeking a highly experienced Senior Data Engineer to lead the design, development, and optimization of scalable data pipelines using AWS Redshift and Apache Airflow. The ideal candidate will have deep expertise in cloud-based data warehousing, workflow orchestration, and ETL processes, with a strong background in SQL and Python. Key Responsibilities: Design, build, and maintain robust ETL/ELT pipelines using Apache Airflow. Integrate da...
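The Airflow-plus-Redshift pipeline work described above typically reduces to scheduling a COPY from S3 into a Redshift table. A minimal sketch, not taken from the posting, assuming Airflow 2.4+, the psycopg2 driver, and placeholder cluster, bucket, table, and credential values:

```python
# A minimal sketch (not from the posting) of an Airflow DAG that loads S3 files into
# Redshift with a COPY statement. Cluster endpoint, credentials, table, bucket, and IAM
# role are placeholders.
from datetime import datetime

import psycopg2
from airflow import DAG
from airflow.operators.python import PythonOperator


def copy_orders_to_redshift():
    # Placeholder connection details; real jobs would pull these from Airflow connections
    # or a secrets manager rather than hard-coding them.
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="***",
    )
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            COPY staging.orders
            FROM 's3://example-bucket/orders/'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
            FORMAT AS PARQUET;
            """
        )


with DAG(
    dag_id="s3_to_redshift_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_orders", python_callable=copy_orders_to_redshift)
```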
Posted 2 months ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Capgemini Invent: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses. Your Role: Should have developed/worked on at least 1 Gen AI project. Has data pipeline implementation experience with any of these cloud providers - AWS, Azure, GCP. Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, S3. Has good knowledge of cloud compute services and load balancing. Has good knowledg...
Posted 2 months ago
1.0 - 3.0 years
4 - 9 Lacs
Hyderabad
Work from Office
Key Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. using Python/open source technologies. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support thei...
Posted 2 months ago
15.0 - 20.0 years
10 - 14 Lacs
Noida
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : AWS BigData Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategi...
Posted 2 months ago
6.0 - 10.0 years
15 - 30 Lacs
Kolkata, Mumbai (All Areas)
Work from Office
Experience: 6 to 10 Years. Job Locations: Kolkata & Mumbai. Notice Period: 30 Days. Job Role: ETL Lead. ETL Lead with strong AWS expertise: AWS Glue, Lambda, RDS (e.g. MySQL). Primary Responsibilities: • 6 to 9 years of experience in data engineering or ETL development. • Proven expertise in AWS Glue, Lambda, S3, and RDS (MySQL) for ETL workflows. • Strong SQL and Python/PySpark development skills. • Solid understanding of data warehousing concepts and data modeling (star/snowflake schemas). • Experience delivering data solutions consumed by Power BI dashboards. • Ability to lead and manage a small team of developers. • Understanding of data modeling concepts and dimensional models (star/snowflake sch...
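The Glue/Lambda/S3 combination listed above is commonly wired together as an S3-triggered Lambda that kicks off a Glue job. An illustrative sketch only, assuming boto3 and a hypothetical Glue job name and argument key:

```python
# Illustrative only: an AWS Lambda handler that starts a Glue ETL job whenever a new
# object lands in S3. The Glue job name and argument key are hypothetical.
import boto3

glue = boto3.client("glue")


def lambda_handler(event, context):
    # S3 put notifications carry the bucket and key of each newly arrived object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        run = glue.start_job_run(
            JobName="nightly-orders-etl",  # placeholder Glue job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue job run {run['JobRunId']} for s3://{bucket}/{key}")
    return {"status": "ok"}
```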
Posted 2 months ago
8.0 - 13.0 years
7 - 14 Lacs
Pune, Mumbai (All Areas)
Hybrid
Job Title: Lead Data Engineer Location: Mumbai / Pune Experience: 8+ yrs Job Summary: We are seeking a technically strong and delivery-focused Lead Engineer to support and enhance enterprise-grade data and application products under the Durables model. The ideal candidate will act as the primary technical interface for the client, ensuring high system availability, performance, and continuous improvement. This role requires a hands-on technologist with strong team management experience, cloud (AWS) expertise, and excellent communication skills to handle client interactions and drive technical decisions. Key Responsibilities: Support & Enhancement Leadership Act as the primary technical lead ...
Posted 2 months ago
5.0 - 8.0 years
7 - 10 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) What do you need for this opportunity? Must have skills required: Data Governance, Lakehouse architecture, Medallion Architecture, Azure DataBricks, Azure Synapse, Data Lake Storage, Azure Data Factory Intelebee LLC is Looking for: Data Engineer: We are seeking a skilled and hands-on Cloud Data Engineer with 5-8 years of experience to drive end-to-end data engineering solutions. The ideal candidate will have a deep understanding of dimensional modeling, data warehousing (DW), Lakehouse architecture, and the Medallion architecture. This role will focus on leveraging the Azure/AWS ecosystem to build scalable, efficient, and s...
Posted 2 months ago
7.0 - 8.0 years
9 - 10 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, PySpark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform: you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and obse...
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
Gurugram, Delhi / NCR
Work from Office
Job Description We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS data services, data wrangling using Python & PySpark, and a solid understanding of data governance, lineage, and quality frameworks. The ideal candidate will have a proven track record of delivering end-to-end data pipelines for logistics, supply chain, enterprise finance, or B2B analytics use cases. Role & responsibilities: Design, build, and optimize ETL pipelines using AWS Glue 3.0+ and PySpark. Implement scalable and secure data lakes using Amazon S3, following bronze/silver/gold zoning. Write performant SQL using AWS Athena (Presto) with CTEs, window functions, and aggregations. Take full own...
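The bronze/silver/gold zoning and Glue 3.0+/PySpark work mentioned above usually boils down to reading raw data, deduplicating or cleansing it, and writing partitioned Parquet into the next zone. A hedged sketch, with assumed bucket paths and columns (shipment_id, updated_at, event_date), not taken from the posting:

```python
# A sketch, under assumed paths and schema, of a bronze-to-silver step: a Glue-style
# PySpark job that deduplicates raw events with a window function and writes
# partitioned Parquet. Bucket names and columns are illustrative only.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.json("s3://example-lake/bronze/shipments/")  # raw, append-only zone

# Keep only the latest record per shipment_id, a common silver-layer cleanup step.
latest = Window.partitionBy("shipment_id").orderBy(F.col("updated_at").desc())
silver = (
    bronze.withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

(
    silver.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-lake/silver/shipments/")
)
```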
Posted 2 months ago
5.0 - 10.0 years
20 - 25 Lacs
Gurugram
Work from Office
Role & responsibilities: Design, build, and maintain scalable and efficient data pipelines to move data between cloud-native databases (e.g., Snowflake) and SaaS providers using AWS Glue and Python. Implement and manage ETL/ELT processes to ensure seamless data integration and transformation. Ensure information security and compliance with data governance standards. Maintain and enhance data environments, including data lakes, warehouses, and distributed processing systems. Utilize version control systems (e.g., GitHub) to manage code and collaborate effectively with the team. Primary Skills: Enhancements, new development, defect resolution, and production support of ETL devel...
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
Gurugram
Hybrid
Job Description We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS data services, data wrangling using Python & PySpark, and a solid understanding of data governance, lineage, and quality frameworks. The ideal candidate will have a proven track record of delivering end-to-end data pipelines for logistics, supply chain, enterprise finance, or B2B analytics use cases. Role & responsibilities Design, build, and optimize ETL pipelines using AWS Glue 3.0+ and PySpark. Implement scalable and secure data lakes using Amazon S3, following bronze/silver/gold zoning. Write performant SQL using AWS Athena (Presto) with CTEs, window functions, and aggregations. Take full owne...
Posted 2 months ago
7.0 - 12.0 years
1 - 2 Lacs
Hyderabad
Remote
Role & responsibilities: We are looking for a highly experienced Senior Cloud Data Engineer to lead the design, development, and optimization of our cloud-based data infrastructure. This role requires deep technical expertise in AWS services, data engineering best practices, and infrastructure automation. You will be instrumental in shaping our data architecture and enabling data-driven decision-making across the organization. Key Responsibilities: Design, build, and maintain scalable and secure data pipelines using AWS Glue, Redshift, and Python. Develop and optimize SQL queries and stored procedures for complex data transformations and migrations. Automate infrastructure provisioning and...
Posted 2 months ago
3.0 - 8.0 years
6 - 16 Lacs
Noida
Remote
Job Title: Freelancer - Informatica to AWS Glue Migration. Employment Type: Freelancer / Contractual (Remote). Location: Remote. Job Description: We are seeking a skilled freelance Data Engineer / ETL Developer to support the migration of ETL pipelines from Informatica to AWS Glue. Responsibilities: Analyze existing Informatica workflows and mappings. Design and implement equivalent AWS Glue jobs using PySpark or Glue Studio. Migrate data transformation logic to AWS Glue and integrate with services like S3, Redshift, Lambda, and CloudWatch. Ensure performance optimization and data quality. Deliver documentation and provide post-migration support. Key Skills: Informatica PowerCenter / Cloud....
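For orientation, a migrated Informatica mapping typically lands as a Glue job script along these lines. This is a generic skeleton, not the actual migration logic; it assumes the AWS Glue job runtime (awsglue libraries, so it runs inside Glue) and placeholder catalog database, table, field, and S3 path names:

```python
# A hedged skeleton of a Glue job that a migrated Informatica mapping might become.
# Database, table, field, and target path names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (the source table would have been crawled beforehand).
source = glue_context.create_dynamic_frame.from_catalog(
    database="legacy_staging", table_name="customer_feed"
)

# Example transformation standing in for migrated mapping logic.
cleaned = source.drop_fields(["raw_payload"]).rename_field("cust_id", "customer_id")

# Write curated output as Parquet to S3 for downstream Redshift/Athena consumption.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/customers/"},
    format="parquet",
)
job.commit()
```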
Posted 2 months ago
7.0 - 12.0 years
30 - 40 Lacs
Hyderabad
Work from Office
Support enhancements to the MDM platform. Develop pipelines using Snowflake, Python, SQL, and Airflow. Track system performance, troubleshoot issues, and resolve production issues. Required Candidate profile: 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow. Good understanding of the investment domain. Experience with dbt, cloud experience (AWS, Azure), DevOps.
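The Snowflake-plus-Python piece above can be illustrated with the Snowflake Python connector; the account, credentials, and MERGE statement below are placeholders, not details from the posting:

```python
# Not from the posting: a minimal sketch using snowflake-connector-python with
# placeholder account, credentials, warehouse, and table names.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_SERVICE",
    password="***",
    warehouse="TRANSFORM_WH",
    database="MDM",
    schema="CURATED",
)

with conn.cursor() as cur:
    # Incrementally refresh a curated table; the MERGE body is illustrative only.
    cur.execute(
        """
        MERGE INTO curated.securities AS tgt
        USING staging.securities AS src
          ON tgt.security_id = src.security_id
        WHEN MATCHED THEN UPDATE SET tgt.price = src.price, tgt.updated_at = src.updated_at
        WHEN NOT MATCHED THEN INSERT (security_id, price, updated_at)
             VALUES (src.security_id, src.price, src.updated_at)
        """
    )
conn.close()
```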
Posted 2 months ago
4.0 - 9.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Python, experienced in performing ETL and data engineering (PySpark, NumPy, Pandas, AWS Glue, and Airflow). SQL: exclusively Oracle hands-on work experience; SQL profilers / query analyzers. AWS cloud services (S3, RDS, Redshift). ETL, Python.
Posted 2 months ago
4.0 - 9.0 years
8 - 16 Lacs
Kolkata
Remote
Enhance/modify applications, configure existing systems, and provide user support. DotNET Full Stack or [Angular 18+ developer + DotNET backend]. SQL Server. Angular version 18+ (nice to have); Angular version 15+ (mandatory).
Posted 2 months ago
7.0 - 12.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Role - Data Engineer Experience - 7+ Years Notice - Immediate Skills - AWS (S3, Glue, Lambda, EC2), Spark, Pyspark, Python, Airflow
Posted 2 months ago
2.0 - 7.0 years
20 - 35 Lacs
Pune
Remote
As a software engineer focused on Marketing and Customer Engagement at GoDaddy, you will have the opportunity to design, build, and maintain a platform that is a keystone to our customer experience, marketing, and business objectives. Everything we do starts with data. Ensure our team continues with a "Shift Left" focus on security. This includes the design and development of systems that can contain sensitive customer information. You will partner closely and collaborate with other GoDaddy teams of Engineers, Marketing Professionals, QA and Operations teams. Leverage industry best practices and methodologies such as Agile, Scrum, testing automation and Continuous Integration and Deployment. ...
Posted 2 months ago
5.0 - 9.0 years
8 - 12 Lacs
Noida
Work from Office
5-9 years in Data Engineering and software development such as ELT/ETL, data extraction and manipulation in a Data Lake/Data Warehouse environment. Expert-level, hands-on experience with the following: Python, SQL, PySpark, DBT, and Apache Airflow; DevOps, Jenkins, CI/CD; Data Governance and Data Quality frameworks; Data Lakes, Data Warehouses; AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.; source code control - GitHub, VSTS, etc. Mandatory Competencies: Python - Python; Database - SQL; Data on Cloud - AWS S3; DevOps - CI/CD; DevOps - GitHub; ETL - AWS Glue; Behavioral - Communication.
Posted 2 months ago
3.0 - 8.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA. Exp: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days. Key Responsibilities & Skillsets: Common Skillsets: 3+ years of experience in analytics, PySpark, Python, Spark, SQL, and associated data engineering jobs. Must have experience with managing and transforming big data sets using PySpark, Spark/Scala, NumPy, and pandas. Excellent communication & presentation skills. Experience in managing Python code and collaborating with customers on model evolution. Good knowledge of database management and Hadoop/Spark, SQL, HIVE, Python (expertise). Superior analytical and problem-solving skills. Should be able to work on a problem independently and p...
Posted 2 months ago
0.0 - 3.0 years
1 - 4 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer Company Name: Kinara Capital Job Description: As a Data Engineer at Kinara Capital, you will play a critical role in building and maintaining the data infrastructure necessary for effective data analysis and decision-making. You will collaborate with data scientists, analysts, and other stakeholders to support data-driven initiatives. Your primary responsibilities will include designing and implementing robust data pipelines, ensuring data quality and integrity, and optimizing data storage and retrieval processes. Key Responsibilities: - Develop, construct, test, and maintain data architectures including databases and large-scale processing systems. - Create and mana...
Posted 2 months ago
6.0 - 11.0 years
20 - 35 Lacs
Gurugram
Hybrid
Must-Have Skills (Core Requirements) – look for resumes that mention hands-on experience with: Amazon S3 – storing and organizing data; AWS Glue – running ETL jobs (basic PySpark knowledge is a plus); Glue Catalog – maintaining metadata for datasets; Amazon Athena – querying data using SQL; Parquet or CSV – basic familiarity with data file formats; AWS Lambda – for simple automation or triggers; Basic IAM knowledge – setting up access permissions; CloudWatch – monitoring jobs or logs; Understanding of ETL/ELT pipelines. Good-to-Have Skills (Preferred but not mandatory) – these add value, but are not essential at this level: AWS Lake Formation – access control and permissions; Apache Airflow or Step Function...
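As a rough illustration of the Athena-querying skill listed above, here is a small boto3 sketch that submits a query and polls for completion; the database, table, and results location are assumptions, not details from the posting:

```python
# Illustrative only: running an Athena query from Python with boto3. Database, table,
# and output location are placeholders.
import time

import boto3

athena = boto3.client("athena")


def run_athena_query(sql: str, database: str, output: str) -> list:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    # Poll until the query finishes; production code would add a timeout and backoff.
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]


rows = run_athena_query(
    "SELECT region, COUNT(*) AS orders FROM orders GROUP BY region",
    database="analytics_db",
    output="s3://example-query-results/athena/",
)
```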
Posted 2 months ago
2.0 - 5.0 years
10 - 17 Lacs
Chennai, Bengaluru
Work from Office
Role & responsibilities: Have strong data engineering knowledge and cloud development exposure. Proficiency in Python. Proficiency in both RDBMS (MySQL preferred) and NoSQL datastores. Spark, Cassandra, the AWS data pipeline stack (Athena, S3, Glue Data Catalog, etc.), Airflow, etc. are technologies you have used. Very comfortable with data lakes, warehouses, and ETL/ELT paradigms. Worked in an agile development environment. Optional: Basic knowledge of statistical analysis, mathematical modelling and machine learning. Experience: Have used or are very hands-on with Microservices, Docker, Kubernetes, Gradle/Ant, Kafka, GIT/Bitbucket in an agile workplace. Develop high-quality code with strong unit/integra...
Posted 2 months ago
2.0 - 4.0 years
10 - 11 Lacs
Pune
Work from Office
Role & responsibilities: Have strong data engineering knowledge and cloud development exposure. Proficiency in Python. Proficiency in both RDBMS (MySQL preferred) and NoSQL datastores. Spark, Cassandra, the AWS data pipeline stack (Athena, S3, Glue Data Catalog, etc.), Airflow, etc. are technologies you have used. Very comfortable with data lakes, warehouses, and ETL/ELT paradigms. Worked in an agile development environment. Optional: Basic knowledge of statistical analysis, mathematical modelling and machine learning. Experience: Have used or are very hands-on with Microservices, Docker, Kubernetes, Gradle/Ant, Kafka, GIT/Bitbucket in an agile workplace. Develop high-quality code with strong unit/integra...
Posted 2 months ago