354 AWS EMR Jobs - Page 4

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

1.0 - 3.0 years

14 - 16 Lacs

Hyderabad, Telangana, India

On-site

Job Description We are looking for a Junior or Associate Software Engineer to join our data engineering team. This role involves building and maintaining robust data pipelines, working with distributed data processing frameworks, and integrating GenAI agents for intelligent monitoring and automation. Key Responsibilities: Design and implement data pipelines using Apache Airflow. Develop, configure, and maintain Apache Spark jobs using Scala or Python. Work with AWS services such as EMR, S3, and RDS to manage data workflows and storage. Build and deploy GenAI agents to monitor job execution, performance metrics, and system health. Collaborate with cross-functional teams to ensure data reliabi...
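
As a rough illustration of the Airflow-plus-EMR work this posting describes, the sketch below is a minimal DAG that submits a PySpark step to an existing EMR cluster and waits for it to finish. It assumes a recent Airflow 2.x with the apache-airflow-providers-amazon package installed; the cluster ID, S3 script path, and connection ID are hypothetical placeholders, not this employer's setup.

```python
# Minimal sketch: submit a PySpark step to an existing EMR cluster from Airflow.
# Cluster ID, S3 paths, and connection ID are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

SPARK_STEP = [
    {
        "Name": "daily_ingest",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "--deploy-mode", "cluster",
                     "s3://example-bucket/jobs/daily_ingest.py"],
        },
    }
]

with DAG(
    dag_id="emr_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    add_step = EmrAddStepsOperator(
        task_id="add_spark_step",
        job_flow_id="j-EXAMPLECLUSTER",   # placeholder EMR cluster ID
        steps=SPARK_STEP,
        aws_conn_id="aws_default",
    )

    wait_for_step = EmrStepSensor(
        task_id="wait_for_spark_step",
        job_flow_id="j-EXAMPLECLUSTER",
        step_id="{{ task_instance.xcom_pull(task_ids='add_spark_step')[0] }}",
        aws_conn_id="aws_default",
    )

    add_step >> wait_for_step
```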

Posted 1 month ago

0.0 years

0 Lacs

India

On-site

About our group: We are a proactive, highly solutions-oriented and collaborative team that works with the various business groups across the organization. Our purpose in capturing massive amounts of data is to transform this vital information into concrete and valuable insights that allow Seagate to make better and more strategic business decisions. About the role - you will: Be part of a team of 10-12 Platform Engineers who are central to developing and maintaining Big Data (Data Lake, Data Warehouse and Data Integration) and advanced analytics platforms at Seagate. Apply your hands-on subject matter expertise in the architecture and administration of Big Data platforms - Data Warehous...

Posted 1 month ago

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Design and implement foundational frameworks for ingestion, orchestration, schema validation, and metadata management. Build robust, scalable pipelines for Change Data Capture (CDC) using Debezium integrated with Kafka and Spark. Optimize data serving layers powered by Trino, including metadata syncing, security filtering, and performance tuning. Partner with SRE and Infra teams to build autoscaling, self-healing, and cost-optimized Spark jobs on AWS EMR. Implement observability features (logs, metrics, alerts) for critical platform services and data pipelines. Define and enforce standards for schema evolution, lineage tracking, and data governance. Automate platform operations using CI/CD pip...
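
As a hedged sketch of the Debezium-Kafka-Spark CDC path mentioned above, the PySpark snippet below reads Debezium change events from a Kafka topic and unpacks the post-change row image. Topic name, bootstrap servers, schema, and output paths are illustrative assumptions, not the team's actual configuration.

```python
# Sketch: consume Debezium CDC events from Kafka with Spark Structured Streaming
# and extract the "after" image. Topic, servers, and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import LongType, StringType, StructField, StructType

spark = SparkSession.builder.appName("cdc-orders").getOrCreate()

# Shape of the Debezium envelope we care about: op code plus the "after" image.
after_schema = StructType([
    StructField("order_id", LongType()),
    StructField("status", StringType()),
    StructField("updated_at", StringType()),
])
envelope_schema = StructType([
    StructField("op", StringType()),
    StructField("after", after_schema),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "dbserver1.inventory.orders")  # Debezium topic naming
    .option("startingOffsets", "latest")
    .load()
)

changes = (
    raw.select(from_json(col("value").cast("string"), envelope_schema).alias("evt"))
    .where(col("evt.op").isin("c", "u"))      # creates and updates only
    .select("evt.after.*")
)

query = (
    changes.writeStream.format("parquet")
    .option("path", "s3://example-bucket/cdc/orders/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
    .start()
)
query.awaitTermination()
```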

Posted 1 month ago

5.0 - 8.0 years

12 - 16 Lacs

Pune

Work from Office

Your role and responsibilities: As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the pl...
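
A minimal sketch of the kind of Spark-on-AWS batch pipeline this posting describes: read raw CSV files from S3, apply a simple transformation, and write partitioned Parquet back to S3. Bucket names and columns are invented for illustration only.

```python
# Sketch: batch ETL with PySpark on an AWS data platform.
# Bucket names and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").enableHiveSupport().getOrCreate()

# Ingest: raw CSV landed in S3 (could equally be a stream or a JDBC source).
orders = (
    spark.read.option("header", True)
    .csv("s3://example-raw-bucket/orders/2024-06-01/")
)

# Transform: type the columns, derive a partition key, drop bad rows.
clean = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
    .dropna(subset=["order_id", "amount"])
)

# Load: partitioned Parquet in the curated zone.
(
    clean.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")
)
```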

Posted 1 month ago

5.0 - 7.0 years

12 - 16 Lacs

Pune

Work from Office

Your role and responsibilities: As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on ...

Posted 1 month ago

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Sr. Data Scientist / Machine Learning Engineer. Experience: 5-8 years. Budget: 140LPM. Remote. Contract: 6 months. Shift: IST timings (8 hours per day). Key Responsibilities: Design, develop, and deploy machine learning models into production environments. Work with large, complex datasets to identify patterns, extract insights, and support business decisions. Implement scalable data processing pipelines using big data technologies such as Hadoop and Spark. Automate workflows and manage data pipelines using Airflow and AWS EMR. Collaborate with data engineers, analysts, and other stakeholders to deliver end-to-end data science solutions. Optimize model performance and ensure robustness, scalability...
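
To make the Airflow/EMR automation mentioned above concrete, here is a hedged boto3 sketch that launches a transient EMR cluster for a single Spark step and lets it terminate when the step finishes. Release label, instance types, IAM roles, and script paths are placeholders, not this role's actual infrastructure.

```python
# Sketch: launch a transient EMR cluster for one Spark step via boto3.
# All names, roles, and paths are hypothetical placeholders.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")

response = emr.run_job_flow(
    Name="feature-build-transient",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"Name": "driver", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "workers", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,   # terminate when the step finishes
    },
    Steps=[{
        "Name": "build-features",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://example-bucket/jobs/build_features.py"],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Started cluster:", response["JobFlowId"])
```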

Posted 1 month ago

8.0 - 13.0 years

40 - 50 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Hybrid

Design and develop new features for a large-scale cloud data platform using Databricks, Snowflake and Fabric. Deeply understand legacy data platforms and drive their modernization. Provide production support. Design data pipelines and the data warehouse. Required Candidate Profile: 5 years in developing and scaling data platforms using Snowflake, Databricks and Fabric; strong Python and SQL; intermediate C#; understanding of modern data architectures such as Data Mesh and Lakehouse.

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

Haryana

On-site

Role Overview: Join us as a Data Engineer. You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making. We'll look to you to drive the build of effortless, digital-first customer experiences. If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for. This role is offered at the vice president level. Key Responsibilities: - Simplify the organization by developing innovative data-driven solutions through data pipelines, modeling, and ETL design to ensure commercial success while maintaining customer and data security. - Understand complex business pro...

Posted 1 month ago

7.0 - 11.0 years

35 - 45 Lacs

Jaipur, Bengaluru, Delhi / NCR

Hybrid

Design and develop the integration of Snowflake with a cloud platform, along with platform enhancements, integrations, and performance optimisation. Work on data ingestion using Python, cataloguing, and lineage tracking. Develop and architect ETL workflows. Required Candidate Profile: 5 years in developing and scaling data platforms centered around Snowflake, with Azure; hands-on Python; understanding of modern data architectures such as Data Mesh, Lakehouse, and ELT.
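
As a rough illustration of Python-driven ingestion into Snowflake (not this employer's actual pipeline), the sketch below loads staged Parquet files with a COPY INTO statement through the snowflake-connector-python package; the account, credentials, stage, and table names are assumptions.

```python
# Sketch: load staged files into a Snowflake table from Python.
# Account, credentials, stage, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.central-india.azure",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

copy_sql = """
    COPY INTO RAW.ORDERS
    FROM @RAW.ORDERS_STAGE/2024/06/01/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
"""

cur = conn.cursor()
try:
    cur.execute(copy_sql)
    for row in cur.fetchall():   # COPY returns one result row per loaded file
        print(row)
finally:
    cur.close()
    conn.close()
```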

Posted 1 month ago

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Senior Cloud Data Developer, you will be responsible for bridging the gap between data engineering and cloud-native application development. Your role will involve utilizing your strong programming skills and data engineering expertise to build and maintain scalable data solutions in the cloud. Core Responsibilities: - Design and develop data-centric applications using Java Spring Boot and AWS services - Create and maintain scalable ETL pipelines using AWS EMR and Apache NiFi - Implement data workflows and orchestration using AWS MWAA (Managed Workflows for Apache Airflow) - Build real-time data processing solutions using AWS SNS/SQS and AWS Pipes - Develop and optimize data storage sol...
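
The real-time piece of this posting relies on AWS SNS/SQS. As a language-agnostic illustration (shown in Python rather than the Java Spring Boot stack named above), this hedged sketch long-polls an SQS queue and deletes each message only after it has been processed; the queue URL and message shape are placeholders.

```python
# Sketch: long-poll an SQS queue and process messages, deleting each one
# only after it has been handled. Queue URL is a hypothetical placeholder.
import json
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/ingest-events"

def handle(event: dict) -> None:
    # Replace with real processing (e.g. write to a staging table or S3).
    print("received:", event.get("detail-type", "unknown"), event)

while True:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,          # long polling
    )
    for msg in resp.get("Messages", []):
        handle(json.loads(msg["Body"]))
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```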

Posted 1 month ago

1.0 - 3.0 years

2 - 5 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

We are seeking a skilled Data Engineer with strong experience in SQL, PySpark, and cloud technologies to join our dynamic team. The ideal candidate will have a solid background in designing and implementing data engineering pipelines, with a focus on performance, scalability, and reliability. You will work closely with other data engineers, data scientists, and stakeholders to develop, maintain, and optimize data infrastructure. Key Responsibilities : SQL Development : - Write medium-complexity SQL queries to extract, transform, and load data efficiently. - Optimize SQL queries for performance and scalability. - Collaborate with team members to understand data requirements and translate them...
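
Since the role centres on SQL plus PySpark, here is a small hedged example of the "medium-complexity SQL" it mentions: a window-function query that keeps only the latest record per customer, executed through Spark SQL. The table and column names are invented for illustration.

```python
# Sketch: a windowed dedup query (latest record per customer) run via Spark SQL.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-dedup").getOrCreate()

spark.read.parquet("s3://example-bucket/curated/customer_events/") \
    .createOrReplaceTempView("customer_events")

latest = spark.sql("""
    SELECT customer_id, email, event_ts
    FROM (
        SELECT
            customer_id,
            email,
            event_ts,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY event_ts DESC
            ) AS rn
        FROM customer_events
    ) ranked
    WHERE rn = 1
""")

latest.write.mode("overwrite").parquet("s3://example-bucket/marts/customer_latest/")
```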

Posted 1 month ago

4.0 - 9.0 years

15 - 25 Lacs

Pune

Work from Office

AIA-Pune. Job Summary: We are seeking a skilled Developer with 4 to 9 years of experience to join our team in a hybrid work model. The ideal candidate will have expertise in Amazon RDS, AWS Glue, AWS DevOps and other AWS technologies. This role involves working with big data and requires proficiency in Python and Apache Spark. Experience in regulatory and process control is a plus. Responsibilities: Develop and maintain scalable data processing systems using Amazon RDS, AWS Glue and AWS DevOps to ensure efficient data management and processing. Collaborate with cross-functional teams to design and implement data solutions that meet business requirements and enhance operational efficiency. Utilize ...
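
For the AWS Glue side of this stack, a minimal hedged job script might look like the following; the catalog database, table, and output path are placeholders, and the script assumes Glue's standard PySpark job entry points rather than anything specific to this employer.

```python
# Sketch: minimal AWS Glue PySpark job - read from the Glue Data Catalog,
# filter, and write Parquet to S3. Database/table/path are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# DynamicFrame -> DataFrame for ordinary Spark transformations.
orders = source.toDF().filter("order_status = 'COMPLETE'")

orders.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```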

Posted 1 month ago

2.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor...

Posted 1 month ago

8.0 - 10.0 years

9 - 13 Lacs

Noida

Work from Office

AWS data developers with 8-10 years of experience; certified candidates (AWS Data Engineer Associate or AWS Solutions Architect) are preferred. Skills required: SQL, AWS Glue, PySpark, Airflow, CDK, Redshift. Good communication skills and the ability to deliver independently. Mandatory Competencies: Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Beh - Communication; Big Data - PySpark; Database - Database Programming - SQL; Programming Language - Python - Apache Airflow.
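
For the Redshift part of this skill set, a hedged sketch of a typical bulk load is shown below: a COPY command that pulls curated Parquet from S3 into a Redshift table, issued through psycopg2. The connection details, IAM role, and table name are hypothetical.

```python
# Sketch: bulk-load Parquet files from S3 into Redshift with a COPY command.
# Connection details, IAM role, and table name are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="***",
)

copy_sql = """
    COPY analytics.fact_orders
    FROM 's3://example-bucket/curated/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
"""

try:
    with conn, conn.cursor() as cur:   # commits on success, rolls back on error
        cur.execute(copy_sql)
finally:
    conn.close()
```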

Posted 1 month ago

6.0 - 10.0 years

5 - 9 Lacs

Noida

Work from Office

AWS data developers with 6-10 years of experience; certified candidates (AWS Data Engineer Associate or AWS Solutions Architect) are preferred. Skills required: SQL, AWS Glue, PySpark, Airflow, CDK, Redshift. Good communication skills and the ability to deliver independently. Mandatory Competencies: Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Big Data - PySpark; Beh - Communication; Database - Database Programming - SQL.

Posted 1 month ago

1.0 - 4.0 years

3 - 6 Lacs

Noida

Work from Office

Contract Duration: 3 months. Skills: - PySpark or Scala with Spark, Spark Architecture, Hadoop, SQL - Streaming technologies like Kafka - Proficiency in advanced SQL (window functions) - Airflow, S3, and StreamSets or similar ETL tools - Basic knowledge of AWS IAM, AWS EMR and Snowflake. Responsibilities: - Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines to collect, process, and store large volumes of structured and unstructured data. - Data Modeling: Develop and maintain data models, schemas, and metadata to support the organization's data initiatives. Ensure data integrity and optimize data storage and retrieval processes. - Data Integ...

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Python Developer, you will be joining a team working with a renowned financial institution to deliver business value through your expertise. Your role will involve analyzing existing SAS DI pipelines and SQL-based transformations, translating and optimizing SAS SQL logic into Python code using frameworks like PySpark, developing scalable ETL pipelines on AWS EMR, implementing data transformation and aggregation logic, designing modular code for distributed data processing tasks, integrating EMR jobs with various systems, and developing Tableau reports for business reporting. Key Responsibilities: - Analyze existing SAS DI pipelines and SQL-based transformations. - Translate and optimize...
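
To illustrate the SAS-SQL-to-PySpark translation work this role involves, here is a hedged before/after sketch: a PROC SQL-style aggregation and an equivalent in the PySpark DataFrame API. The table, columns, and paths are invented, not the institution's actual code.

```python
# Sketch: translating a SAS PROC SQL aggregation into PySpark.
# Original (SAS, illustrative):
#   proc sql;
#     create table acct_summary as
#     select account_id, sum(txn_amount) as total_amount, count(*) as txn_count
#     from transactions
#     where txn_date >= '01JAN2024'd
#     group by account_id;
#   quit;
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration").getOrCreate()

transactions = spark.read.parquet("s3://example-bucket/raw/transactions/")

acct_summary = (
    transactions
    .filter(F.col("txn_date") >= "2024-01-01")
    .groupBy("account_id")
    .agg(
        F.sum("txn_amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

acct_summary.write.mode("overwrite").parquet("s3://example-bucket/curated/acct_summary/")
```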

Posted 1 month ago

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a skilled and proactive Python / PySpark Developer at our company, you will join our data engineering or analytics team. Your primary responsibility will be to build scalable data pipelines, perform large-scale data processing, and collaborate with data scientists, analysts, and business stakeholders. Key Responsibilities: - Design, develop, and optimize ETL data pipelines using PySpark on big data platforms (e.g., Hadoop, Databricks, EMR). - Write clean, efficient, and modular code in Python for data processing and integration tasks. - Work with large datasets to extract insights, transform raw data, and ensure data quality. - Collaborate with cross-functional teams to understand busines...

Posted 1 month ago