1847 AWS Glue Jobs - Page 34

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

17 - 32 Lacs

hyderabad, pune, bengaluru

Work from Office

Job Description: Strong expertise in Python (including data processing/manipulation libraries such as pandas, PySpark). Hands-on experience in designing and maintaining ETL pipelines. Proficiency with AWS cloud services (Glue, Lambda, EMR, ECS, Lake Formation, etc.). Experience with orchestration tools such as Airflow and AWS Step Functions. Exposure to databases like Redshift, Aurora, Postgres, or Snowflake. Experience with Redshift is preferred.
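
Skills like these are easiest to judge against a concrete snippet. Below is a minimal sketch of the kind of pandas clean-and-aggregate step such an ETL pipeline performs; the column names (`order_id`, `region`, `amount`) are illustrative, not taken from any posting.

```python
import pandas as pd

def summarize_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop invalid rows, then aggregate revenue per region."""
    cleaned = df.dropna(subset=["amount"])        # discard rows with no amount
    cleaned = cleaned[cleaned["amount"] > 0]      # keep only positive amounts
    return (
        cleaned.groupby("region", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "total_amount"})
    )

# Hypothetical input: one missing amount and one negative amount get filtered out.
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "region": ["south", "south", "north", "north"],
    "amount": [100.0, None, 50.0, -5.0],
})
print(summarize_orders(raw))  # one row per region with its total_amount
```

In a Glue job the same transform would typically be written in PySpark (often against a DynamicFrame), but the shape of the logic is the same.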

Posted 2 months ago

5.0 - 8.0 years

11 - 16 Lacs

chennai

Work from Office

Role Overview: We are looking for a skilled Backend Developer with strong Python expertise and experience in building scalable microservices. The candidate should be comfortable working in AWS environments and collaborating with cross-functional teams. Key Responsibilities: - Develop and maintain microservices-based backend systems. - Design and implement efficient APIs. - Optimize database queries and schema design. - Support production systems and troubleshoot issues. - Collaborate with team members on code reviews and development practices. - Work with GitHub workflows and CI/CD pipelines. - Integrate with ElasticSearch where needed. - Utilize AWS services including AWS Glue for ETL and d...

Posted 2 months ago

2.0 - 3.0 years

5 - 5 Lacs

kochi, chennai, thiruvananthapuram

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation. Outcomes: Operate with minimal guidance to develop error-free code, test applications, and document the development process. Understand application features and component designs to develop them in accordance with user stories and requirements. Code, debug, test, document, and communicate the stages of product...

Posted 2 months ago

2.0 - 3.0 years

5 - 5 Lacs

thiruvananthapuram

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation. Outcomes: Operate with minimal guidance to develop error-free code, test applications, and document the development process. Understand application features and component designs to develop them in accordance with user stories and requirements. Code, debug, test, document, and communicate the stages of product...

Posted 2 months ago

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be part of a team working on challenging issues in financial services and technology at FIS. The team at FIS is described as open, collaborative, entrepreneurial, passionate, and fun. In the competitive private equity market, organizations are under pressure to deliver superior returns and meet stringent reporting requirements. Your role will involve developing core versions of software applications for sale to external clients and identifying client purchasing requirements and technical specifications. You will also interact with engineering groups to assist in design changes and train clients on the systems application. The shift timing for this role is from 4 PM to 1 AM. - Proficiency ...

Posted 2 months ago

5.0 - 12.0 years

0 Lacs

kolkata, west bengal

On-site

Role Overview: You will be responsible for handling data services on AWS, where your expertise in various AWS services such as AWS Glue, AWS Lambda, AWS RDS, AWS S3, DynamoDB, and PySpark will be crucial. Key Responsibilities: - Utilize AWS Glue for ETL processes and data preparation tasks - Implement serverless functions using AWS Lambda - Manage databases on AWS RDS - Handle data storage and retrieval using AWS S3 - Work with DynamoDB for NoSQL database requirements - Develop data processing applications using PySpark Qualifications Required: - Proficiency in AWS services like AWS Glue, AWS Lambda, AWS RDS, AWS S3, DynamoDB - Experience with PySpark for data processing - 5 to 12 years o...
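
As a sketch of how the listed services commonly fit together, here is a hypothetical S3-triggered Lambda handler that starts a Glue job run for each new object. The job name, event shape, and `--input_path` argument are assumptions; the Glue client is injected as a parameter so the logic can be exercised without AWS credentials (a real Lambda would pass `boto3.client("glue")`).

```python
def handler(event, glue_client, job_name="example-etl-job"):
    """Start one Glue job run per S3 object in the event notification."""
    run_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # start_job_run is the standard boto3 Glue API for launching a job;
        # job arguments are passed as "--name": "value" string pairs.
        resp = glue_client.start_job_run(
            JobName=job_name,
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
        run_ids.append(resp["JobRunId"])
    return {"started": run_ids}
```

Injecting the client this way also makes the handler straightforward to unit-test with a stub that records the `start_job_run` calls.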

Posted 2 months ago

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Lead Data AI/ML Engineer - Vice President at Barclays, you will spearhead the evolution of the Corporate Data Services function. Your role will involve effective stakeholder management, leadership, and decision-making to support business strategy and risk management. **Key Responsibilities:** - Participating in daily stand-up meetings to discuss progress, blockers, and plans for the day. - Working on deploying AI/ML models into production environments using AWS services like Glue, Lambda, and Step Functions. - Building and maintaining data pipelines using Spark and Python to ensure seamless data flow and integration. - Collaborating with data scientists, software engineers, and other st...

Posted 2 months ago

4.0 - 8.0 years

6 - 15 Lacs

pune

Hybrid

Role & responsibilities: 6+ years of experience in data engineering or cloud data development. Design, develop, and optimize ETL/ELT data pipelines using Apache Spark (PySpark or Scala), AWS Glue, and Azure Data Factory. Work with structured and unstructured data to build scalable ingestion and transformation workflows across cloud platforms. Build data lake and data warehouse solutions using AWS S3 and Azure Data Lake. Collaborate with data scientists, analysts, and application developers to support advanced analytics, reporting, and ML workflows. Implement job orchestration, monitoring, and error handling for reliable pipeline execution. Maintenance and support of data pipelines. D...

Posted 2 months ago

2.0 - 6.0 years

12 - 16 Lacs

kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developin...

Posted 2 months ago

2.0 - 4.0 years

4 - 8 Lacs

pune

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: AWS Architecture. Good to have skills: Python (Programming Language). Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quali...

Posted 2 months ago

4.0 - 8.0 years

6 - 11 Lacs

gurugram

Work from Office

Role Description The role involves leading the design and management of data solutions in a cloud environment with a focus on AWS services. Responsibilities: Lead the design, implementation, and optimization of scalable data pipelines and architectures utilizing AWS Glue, Elastic MapReduce (EMR), Lambda, Redshift, Athena, DynamoDB, OpenSearch, and S3 Use Spark on AWS for data transformation and processing across large datasets Develop and maintain efficient data workflows with SQS for task queueing and orchestration Integrate, transform, and manage data using Mulesoft and Talend for seamless data integration Ensure high-performance data storage, retrieval, and analytics across Redshift, Dyna...

Posted 2 months ago

7.0 - 9.0 years

11 - 15 Lacs

gurugram

Work from Office

Role Description: As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining securit...

Posted 2 months ago

4.0 - 6.0 years

7 - 12 Lacs

hyderabad

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining secu...

Posted 2 months ago

4.0 - 6.0 years

7 - 11 Lacs

gurugram

Work from Office

Role Description As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining secur...

Posted 2 months ago

7.0 - 9.0 years

11 - 15 Lacs

chennai

Work from Office

Role Description: As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining secur...

Posted 2 months ago

4.0 - 8.0 years

6 - 11 Lacs

gurugram

Work from Office

The role involves leading the design and management of data solutions in a cloud environment with a focus on AWS services. Responsibilities: Lead the design, implementation, and optimization of scalable data pipelines and architectures utilizing AWS Glue, Elastic MapReduce (EMR), Lambda, Redshift, Athena, DynamoDB, OpenSearch, and S3. Use Spark on AWS for data transformation and processing across large datasets. Develop and maintain efficient data workflows with SQS for task queueing and orchestration. Integrate, transform, and manage data using Mulesoft and Talend for seamless data integration. Ensure high-performance data storage, retrieval, and analytics across Redshift, DynamoDB, and Ath...

Posted 2 months ago

11.0 - 12.0 years

15 - 20 Lacs

gurugram

Work from Office

Role Description: As a Principal Engineer - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining s...

Posted 2 months ago

7.0 - 9.0 years

11 - 15 Lacs

chennai

Work from Office

Role Description As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on the AWS platform. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS) Integrating and processing large amounts of structured and unstructured data from various sources Implementing and optimizing ETL processes and data pipelines Developing and maintaining securi...

Posted 2 months ago

3.0 - 7.0 years

4 - 9 Lacs

noida, hyderabad, pune

Work from Office

DATA ENGINEER (Databricks & AWS) Overview: As a Data Engineer, you will work with multiple teams to deliver solutions on the AWS Cloud using core cloud data engineering tools such as Databricks on AWS, AWS Glue, Amazon Redshift, Athena, and other Big Data-related technologies. This role focuses on building the next generation of application-level data platforms and improving recent implementations. Hands-on experience with Apache Spark (PySpark, SparkSQL), Delta Lake, Iceberg, and Databricks is essential. Responsibilities : • Define, design, develop, and test software components/applications using AWS-native data services: Databricks on AWS, AWS Glue, Amazon S3, Amazon Redshift, Athena, AWS ...

Posted 3 months ago

2.0 - 6.0 years

0 Lacs

karnataka

On-site

Role Overview: At PwC, your role as an Associate in the Tower of Data, Analytics & Specialist Managed Service will involve working as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. Your responsibilities at this management level will include but are not limited to: Key Responsibilities: - Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. - Be flexible to work in stretch opportunities/assignments. - Demonstrate critical thinking and the ability to bring order to unstructured problems. - Review Ticket Quality and deliverables, provide Status Reporting for the project. - Adhere to SLA...

Posted 3 months ago

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

Role Overview: You will be responsible for migrating SSIS packages to AWS Glue, focusing on ETL processes and cloud computing. This role will involve automating the migration using tools like AWS Schema Conversion Tool (AWS SCT) and developing custom connectors. Key Responsibilities: - Plan and analyze the migration process - Create AWS Glue jobs - Develop custom connectors - Perform data transformation and validation - Monitor and maintain the migrated packages Qualifications Required: - Strong knowledge of AWS Glue - Hands-on experience with AWS Cloud - Proficiency in ETL Concepts - Familiarity with SSIS - Scripting and programming skills - Additional skills in AWS Redshift and SQL profici...
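
The "data transformation and validation" step of such a migration can be made concrete. Below is a minimal sketch of a post-migration check comparing the rows an SSIS package produced against the rows the Glue job wrote; the key column name and report fields are illustrative, not from the posting.

```python
def validate_migration(source_rows, target_rows, key="id"):
    """Compare two lists of row dicts and report count and key mismatches."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),     # lost during migration
        "unexpected_in_target": sorted(tgt_keys - src_keys),  # spurious extra rows
        "match": src_keys == tgt_keys and len(source_rows) == len(target_rows),
    }
```

In practice the two row sets would come from the legacy SQL Server table and the migrated target (for example via an Athena or Redshift query); the comparison logic itself stays the same.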

Posted 3 months ago

6.0 - 10.0 years

18 - 25 Lacs

hyderabad, bengaluru

Work from Office

Role: AWS + PySpark Data Engineer. Experience: 4-8 yrs. Must Have Skills: 4 years of relevant experience in AWS. PySpark: distributed data processing, job optimization, troubleshooting. AWS Data Lake: S3, Glue, Athena. Redshift: SQL queries, integration with PySpark. Strong SQL (advanced). Python coding for data manipulation & pipelines. Good to Have: SAP S/4HANA data extraction (OData, RFC, connectors). Pandas/NumPy for Python-based data transformation.

Posted 3 months ago

4.0 - 8.0 years

6 - 11 Lacs

gurugram

Work from Office

The role involves leading the design and management of data solutions in a cloud environment with a focus on AWS services. Responsibilities: Lead the design, implementation, and optimization of scalable data pipelines and architectures utilizing AWS Glue, Elastic MapReduce (EMR), Lambda, Redshift, Athena, DynamoDB, OpenSearch, and S3. Use Spark on AWS for data transformation and processing across large datasets. Develop and maintain efficient data workflows with SQS for task queueing and orchestration. Integrate, transform, and manage data using Mulesoft and Talend for seamless data integration. Ensure high-performance data storage, retrieval, and analytics across Redshift, DynamoDB, and Ath...

Posted 3 months ago

14.0 - 19.0 years

30 - 32 Lacs

gurugram, bengaluru

Work from Office

Join us as a Data Engineer. You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making. We'll look to you to drive the build of effortless, digital-first customer experiences. If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for. We're offering this role at vice president level. What you'll do: As a Data Engineer, you'll be looking to simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers, and the bank's data, safe and se...

Posted 3 months ago

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies