1847 AWS Glue Jobs - Page 36

4.0 - 9.0 years

15 - 30 Lacs

gurugram

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements. Roles & Responsibilities: - Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS) - Integrating and processing large amounts of structured and unstructured data from various sources - Implementing and optimizing ETL processes and data pipelines - Developing and maintaining secur...

Posted 3 months ago

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Software Engineer III at JPMorgan Chase within the Corporate Technology, your role involves serving as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable manner. Your responsibilities include executing software solutions, design, development, and technical troubleshooting with the ability to think innovatively, creating secure and high-quality production code, producing architecture and design artifacts for complex applications, and contributing to software engineering communities of practice and events exploring new technologies. You are also expected to proactively identify hidden problems and patterns ...

Posted 3 months ago

5.0 - 9.0 years

0 Lacs

indore, madhya pradesh

On-site

As a Sr/Lead Data Engineer at Ccube in Indore, MP (Hybrid Model), you will have the opportunity to work with large and complex sets of data to meet both non-functional and functional business requirements. Your key responsibilities will include: - Gathering and assembling data using skills such as SQL, Python, R, Data Modeling, Data Warehousing, and AWS (S3, Athena) - Creating new data pipelines or enhancing existing pipelines with tools like ETL Tools (e.g., Apache NiFi, Talend), Python (Pandas, PySpark), AWS Glue, JSON, XML, YAML - Identifying, designing, and implementing process improvements with tools like Apache Airflow, Terraform, Kubernetes, AWS Lambda, CI/CD pipelines, Docker - Build...

Posted 3 months ago

3.0 - 8.0 years

0 Lacs

coimbatore, tamil nadu

On-site

Role Overview: At Techjays, we are on a mission to empower businesses worldwide by building AI solutions that drive industry transformation. We are seeking a skilled Data Analytics Engineer with 3 to 8 years of experience to join our global team in Coimbatore. As a Data Analytics Engineer at Techjays, you will play a crucial role in designing and delivering scalable, data-driven solutions that support real-time decision-making and deep business insights. Your primary focus will be on developing and maintaining ETL/ELT data pipelines, collaborating with various teams, designing interactive dashboards, and ensuring high reliability of data pipelines. If you are passionate about using tools lik...

Posted 3 months ago

12.0 - 16.0 years

0 Lacs

hyderabad, telangana

On-site

As a SAP Analytics Leader at Decimal Business Solutions, you will be responsible for driving IT Business Transformations with a focus on large-scale SAC Planning, SAP Datasphere, SAP Business AI, and various cloud solutions such as AWS, Azure, RISE with SAP, and S/4HANA Cloud. Your expertise across different business processes like Finance, Enterprise Planning, HR, Supply Chain, and Sales will be crucial in this role. Your key responsibilities will include: - Leading 3+ End to End Green Field SAC Planning Implementations with S/4HANA - Thought leadership in the SAP Analytics area - Hands-on experience in FP&A, Analytics & Data Science Solutioning and Consulting - Extensive experience in RFP ...

Posted 3 months ago

7.0 - 11.0 years

0 Lacs

karnataka

On-site

As a Senior Data Engineer at our company, you will be responsible for contributing to the development and maintenance of scalable data solutions. Your expertise in SQL, Snowflake, and AWS-based ETL tools will be crucial in ensuring the high performance of our data systems. **Key Responsibilities:** - Building and deploying reliable ETL/ELT workflows - Designing large-scale data pipelines and automated audit processes - Extracting data from diverse sources such as flat files, XML, big data appliances, and RDBMS - Collaborating across teams in Agile environments - Writing and updating technical documentation **Qualifications Required:** - 7+ years of relevant experience - Strong hands-on exper...

Posted 3 months ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a data infrastructure expert at MPOWER, you will be responsible for designing, building, and maintaining data pipelines in Python and AWS to support the company's business decisions. Your key responsibilities will include: - Designing, building, and maintaining data pipelines in Python and AWS, utilizing services like AWS Redshift, AWS Glue, and MWAA. - Ensuring data quality and integrity through validation, cleansing, and error handling techniques. - Monitoring, troubleshooting, and optimizing the performance of data pipelines for reliability and scalability. - Establishing user needs, monitoring access and security, and capacity planning for database storage requirements. - Building dat...
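For illustration only (not part of the listing), here is a minimal sketch of the kind of Python building block such pipelines often use: triggering an AWS Glue job with boto3 and polling until it finishes. The job name is a placeholder and error handling is deliberately minimal.

import time
import boto3

def run_glue_job(job_name: str, poll_seconds: int = 30) -> str:
    """Start a Glue job run and block until it reaches a terminal state."""
    glue = boto3.client("glue")
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(poll_seconds)  # still starting/running; wait before polling again

if __name__ == "__main__":
    print(run_glue_job("example-orders-etl"))  # hypothetical job name

In an MWAA deployment, a call like this would typically live inside an Airflow task rather than a standalone script.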

Posted 3 months ago

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Data Platform Engineer at Zywave, you will play a crucial role in designing, developing, and optimizing our enterprise data platform. Your expertise in Snowflake, ELT pipelines, DBT, and Azure Data Factory will be essential in driving data-driven decision-making across the organization. **Key Responsibilities:** - Design and implement scalable ELT pipelines using DBT and Azure Data Factory for data ingestion, transformation, and loading into Snowflake. - Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver robust data models. - Optimize Snowflake performance through clustering, partitioning, and query tuning. - Dev...

Posted 3 months ago

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As an Associate at PwC, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. Your responsibilities at this management level include but are not limited to: - Use feedback and reflection to develop self-awareness, personal strengths, and address development areas - Be flexible to work in stretch opportunities/assignments - Demonstrate critical thinking and the ability to bring order to unstructured problems - Conduct Ticket Quality and deliverables review, Status Reporting for the project - Adhere to SLAs, have experience in incident management, change management and problem management - Seek and embrace opportunities that pr...

Posted 3 months ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a highly skilled and visionary Senior Integration & Data Engineering Lead, you will be responsible for architecting and delivering cloud-native, API-first solutions to drive enterprise-scale digital transformation. Your deep expertise in Boomi, event-driven architectures, Kafka, AWS, AWS Glue, modern ETL tools like Informatica, Matillion, or IICS, Snowflake, and Python will be key to success in this role. **Key Responsibilities:** - Design and Lead the development of API-first, modular integration solutions using Boomi and Kafka. - Design and implement scalable, cloud-native data pipelines using AWS Glue, Snowflake, and modern ETL platforms. - Collaborate with enterprise architects and bu...

Posted 3 months ago

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

Role Overview: As a Software Engineer III at JPMorgan Chase within the Corporate Technology, you will be a seasoned member of an agile team responsible for designing and delivering trusted market-leading technology products in a secure, stable, and scalable manner. Your role involves executing software solutions, development, and technical troubleshooting while maintaining high-quality production code and algorithms. You will contribute to the architecture and design of complex applications, ensuring design constraints are met by software code development. Additionally, you will analyze and synthesize data sets to drive continuous improvement of software applications and systems. Your proact...

Posted 3 months ago

8.0 - 12.0 years

0 - 2 Lacs

pune

Hybrid

Technology Lead - ETL & Data Engineering (Fandango). Location: Pune / Hybrid. Experience: 8 to 12 Years. Employment Type: Full-Time. Job Summary: We are seeking a highly skilled and experienced Technology Lead with 10+ years of expertise in ETL, Data Warehousing, and AWS cloud services. The ideal candidate should have strong hands-on experience with Talend (Data Integration, Big Data, Admin), AWS Glue, PySpark, Airflow (MWAA preferred), and AWS Redshift. You will play a critical role in designing, developing, and leading data integration pipelines, ensuring scalability, performance, and business impact. Key Responsibilities: Lead the design, development, and deployment of end-to-end ETL and Data War...

Posted 3 months ago

7.0 - 12.0 years

0 - 0 Lacs

hyderabad, chennai, bengaluru

Hybrid

Primary Skills: Python (NumPy / Pandas), AWS (S3, Lambda / Glue), SQL. Experience: 7+ years. Location: Bangalore, Hyderabad, Chennai (Hybrid mode). Notice Period: Immediate to 30 days preferred. Interview Mode: Virtual.

Posted 3 months ago

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You are an experienced AWS Data Engineer (Glue Developer) with 6-8 years of experience, specializing in building and managing ETL pipelines using AWS Glue, Python, and SQL. Your expertise in data modeling, hands-on knowledge of AWS services, and experience in working with multi-account AWS environments are key to succeeding in this role. Key Responsibilities: - Design, develop, and maintain robust, scalable ETL pipelines using AWS Glue (Jobs, Crawlers, Workflows). - Write efficient, production-grade Python and SQL code for data extraction, transformation, and loading. - Build and optimize data models for analytical and operational workloads. - Integrate and manage data ingestion from multipl...
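As a rough illustration of the kind of script this role describes (not taken from the posting), below is a minimal AWS Glue PySpark job skeleton; the catalog database, table name, and S3 output path are placeholder names.

import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (e.g., a table populated by a crawler)
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders"
)

# Rename/cast a couple of columns
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
    ],
)

# Write the curated output to S3 as Parquet
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()

In practice a job like this would be parameterized, wired into Glue Workflows or an external scheduler, and accompanied by crawlers that keep the catalog in sync.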

Posted 3 months ago

5.0 - 10.0 years

2 - 7 Lacs

noida, gurugram, delhi / ncr

Work from Office

ETL / Pipeline Developer (2). Location: Gurgaon (Onsite). Experience: Minimum 5+ years; immediate joiners. Key Responsibilities: Develop, maintain, and optimize ETL processes and data pipelines. Extract, transform, and load data from multiple sources into a data warehouse/data lake. Implement data validation, cleansing, and transformation logic. Work with architects to align pipeline development with the overall solution design. Monitor performance and troubleshoot ETL processes to ensure smooth data flow. Document ETL workflows and support handover to operations teams. Required Skills: Strong hands-on experience with ETL tools (Talend and similar). Proficiency in SQL, stored procedures, and scripting (P...

Posted 3 months ago

6.0 - 9.0 years

5 - 14 Lacs

kolkata, hyderabad, pune

Work from Office

Hi, this is Riddhi from Silverlink Technologies. We have an excellent job opportunity with TCS for the post of "Data Engineer (AWS, Databricks)" at the Bengaluru, Pune, Hyderabad, and Kolkata locations. If interested, kindly forward your Word-formatted updated resume ASAP to Riddhi@silverlinktechnologies.com and fill in the below-mentioned details too. Full Name: Contact No: Email ID: DOB: Experience: Relevant Exp: Current Company: Notice Period: Current CTC: Expected CTC: Offer in hand: If yes then offered CTC: Date of joining: Company name: Grades -- 10th: 12th: Graduation: Full time/Part Time? University Name: Current Location: Preferred Location: Gap in education: Gap in employment: **Ma...

Posted 3 months ago

4.0 - 9.0 years

6 - 10 Lacs

bengaluru

Work from Office

We are looking for a skilled AWS Data Pipeline Engineer with 4 to 12 years of experience to manage data storage solutions on AWS, implement data processing workflows, and collaborate with cross-functional teams. The ideal candidate will have a strong understanding of core AWS services, cloud concepts, and the AWS Well-Architected Framework. Roles and Responsibilities: Managing data storage solutions on AWS, including Amazon S3, Amazon Redshift, and Amazon DynamoDB. Implementing and optimizing data processing workflows using AWS services like AWS Glue, Amazon EMR, and AWS Lambda. Collaborating with Spotfire Engineers and business analysts to ensure data accessibility and usability for analysis a...

Posted 3 months ago

6.0 - 8.0 years

3 - 6 Lacs

hyderabad

Work from Office

Looking to onboard a skilled professional with 6-8 years of experience in PySpark, SQL, and AWS Glue. The ideal candidate will have a strong background in these technologies and excellent problem-solving skills. This position is available across Pan India. Roles and Responsibilities: Design, develop, and implement data processing pipelines using PySpark and AWS Glue. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data warehouses using Snowflake and Snowpark. Optimize data processing workflows for performance and scalability. Troubleshoot and resolve technical issues related to data processing and storage. Ensure data quali...

Posted 3 months ago

5.0 - 10.0 years

2 - 6 Lacs

chennai

Work from Office

Responsibilities: Design, develop, and deploy scalable, secure applications using AWS cloud-native technologies. Use AWS services like Lambda, Kinesis, Redshift, and API Gateway to build high-performance solutions. Architect serverless applications using AWS Lambda for event-driven computing. Build and maintain real-time or batch data pipelines. Design and optimize data warehouses with Amazon Redshift for large-scale data storage and analysis. Implement event-driven architectures for efficient communication between various AWS services. Write and optimize PySpark code for data processes to transform and load data. Ensure security and compliance with AWS best practices, including VPC, IA...
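Purely as an illustrative sketch of the event-driven pattern this listing names (not part of the posting), a minimal AWS Lambda handler for a Kinesis event source; the downstream load step is left as a comment and all names are hypothetical.

import base64
import json

def handler(event, context):
    """Decode Kinesis records from the triggering event and count them."""
    processed = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])  # Kinesis payloads arrive base64-encoded
        message = json.loads(payload)  # assumes JSON payloads
        # ... transform/enrich `message`, then load it to Redshift, S3, etc.
        processed += 1
    return {"processed": processed}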

Posted 3 months ago

5.0 - 10.0 years

2 - 6 Lacs

bengaluru

Work from Office

Responsibilities: Design, develop, and deploy scalable, secure applications using AWS cloud-native technologies. Use AWS services like Lambda, Kinesis, Redshift, and API Gateway to build high-performance solutions. Architect serverless applications using AWS Lambda for event-driven computing. Build and maintain real-time or batch data pipelines. Design and optimize data warehouses with Amazon Redshift for large-scale data storage and analysis. Implement event-driven architectures for efficient communication between various AWS services. Write and optimize PySpark code for data processes to transform and load data. Ensure security and compliance with AWS best practices, including VPC, IA...

Posted 3 months ago

5.0 - 10.0 years

4 - 7 Lacs

mumbai

Hybrid

PF Detection is mandatory. Job Description: Minimum 5 years of experience in database development and ETL tools. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL). Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell). Experience with data modeling and schema design. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake). Understanding of data warehousing concepts and best practices.

Posted 3 months ago

6.0 - 11.0 years

2 - 5 Lacs

kolkata, hyderabad, peth

Work from Office

i. PySpark, Spark SQL, SQL, and Glue ii. AWS cloud experience iii. Good understanding of dimensional modelling iv. Good understanding of DevOps, CloudOps, DataOps, and CI/CD, with an SRE mindset v. Understanding of Lakehouse and DW architecture vi. Strong analysis and analytical skills vii. Understanding of version control systems, specifically Git viii. Strong in software engineering, APIs, Microservices, etc. Soft skills: i. Written and oral communication skills ii. Ability to translate business needs to systems. Location - Hyderabad, Kolkata, Peth, Pune

Posted 3 months ago

5.0 - 10.0 years

11 - 15 Lacs

hyderabad

Work from Office

Stellantis is seeking a passionate, innovative, results-oriented Information Communication Technology (ICT) Manufacturing AWS Cloud Architect to join the team. As a Cloud Architect, the selected candidate will leverage business analysis, data management, and data engineering skills to develop sustainable data tools supporting Stellantis's Manufacturing Portfolio Planning. This role will collaborate closely with data analysts and business intelligence developers within the Product Development IT Data Insights team. Job responsibilities include but are not limited to: Having deep expertise in the design, creation, management, and business use of large datasets, across a variety of data platform...

Posted 3 months ago

5.0 - 7.0 years

12 - 16 Lacs

bengaluru

Work from Office

Key Responsibilities: Design, build, and optimize scalable data pipelines on AWS. Develop and maintain data models to support analytics and business intelligence. Implement orchestration workflows using Airflow or AWS Step Functions. Work with structured and semi-structured data across S3, Redshift, Glue, Lambda. Write efficient, reusable, and modular Python and SQL code for ETL processes. Ensure data reliability, scalability, and performance tuning of pipelines. Integrate third-party and internal APIs into data workflows. Collaborate with stakeholders to translate business requirements into technical solutions. Required Skillset: Strong expertise in AWS architecture & optimization (S3, Reds...
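As a hedged illustration of the orchestration pattern mentioned in this listing (not from the posting itself), a minimal Airflow DAG sketch with stubbed extract/transform/load tasks; the DAG id and schedule are arbitrary placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # pull raw data from S3 or an upstream API

def transform():
    pass  # clean and model the data (e.g., via pandas or a Glue/Spark job)

def load():
    pass  # load curated data into Redshift

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load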

Posted 3 months ago

8.0 - 11.0 years

15 - 27 Lacs

noida, mumbai, pune

Hybrid

About the Role We are seeking a highly skilled Lead Data Engineer to define and drive the data migration strategy from legacy RDBMS platforms to PostgreSQL for a mission-critical billing and invoicing system. This role requires a hands-on technical leader who can design the migration architecture, oversee implementation, and ensure seamless transition without disruption to core business functions. The ideal candidate will have deep expertise in data migration frameworks, large-scale distributed processing (Spark preferred), and experience with both open-source and AWS ecosystem tools. They will work closely with business stakeholders and customers to manage expectations, deliver with precisi...

Posted 3 months ago
