1333 AWS Glue Jobs - Page 40

JobPe aggregates results for easy application access, but you apply directly on the job portal.

7.0 - 12.0 years

15 - 30 Lacs

Gurugram

Hybrid

Job Description: We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS data services, data wrangling using Python & PySpark, and a solid understanding of data governance, lineage, and quality frameworks. The ideal candidate will have a proven track record of delivering end-to-end data pipelines for logistics, supply chain, enterprise finance, or B2B analytics use cases.

Role & responsibilities:
- Design, build, and optimize ETL pipelines using AWS Glue 3.0+ and PySpark.
- Implement scalable and secure data lakes using Amazon S3, following bronze/silver/gold zoning.
- Write performant SQL using AWS Athena (Presto) with CTEs, window functions, and aggregations.
- Take full owne...
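The Athena (Presto) pattern named above, a CTE feeding a window function for aggregation, can be sketched with standard SQL. The snippet below uses Python's built-in sqlite3 purely as a stand-in engine so it can run anywhere; the table and column names are hypothetical, but the query shape is the same one you would write in Athena.

```python
import sqlite3

# In-memory SQLite stands in for Athena here; the CTE + window-function
# shape is identical in Presto/Athena SQL. Table/columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (region TEXT, ship_date TEXT, cost REAL);
INSERT INTO shipments VALUES
  ('north', '2024-01-01', 120.0),
  ('north', '2024-01-02', 80.0),
  ('south', '2024-01-01', 200.0);
""")

query = """
WITH daily AS (                       -- CTE: aggregate cost per region/day
  SELECT region, ship_date, SUM(cost) AS day_cost
  FROM shipments
  GROUP BY region, ship_date
)
SELECT region, ship_date, day_cost,
       SUM(day_cost) OVER (           -- window function: running total
         PARTITION BY region ORDER BY ship_date
       ) AS running_cost
FROM daily
ORDER BY region, ship_date
"""
rows = conn.execute(query).fetchall()
for r in rows:
    print(r)
```

In Athena the same query would run against a Glue Catalog table backed by S3; only the engine changes, not the SQL pattern.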

Posted 4 months ago


7.0 - 12.0 years

1 - 2 Lacs

Hyderabad

Remote

Role & responsibilities: We are looking for a highly experienced Senior Cloud Data Engineer to lead the design, development, and optimization of our cloud-based data infrastructure. This role requires deep technical expertise in AWS services, data engineering best practices, and infrastructure automation. You will be instrumental in shaping our data architecture and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, build, and maintain scalable and secure data pipelines using AWS Glue, Redshift, and Python.
- Develop and optimize SQL queries and stored procedures for complex data transformations and migrations.
- Automate infrastructure provisioning and...

Posted 4 months ago


3.0 - 8.0 years

6 - 16 Lacs

Noida

Remote

Job Title: Freelancer Informatica to AWS Glue Migration
Employment Type: Freelancer / Contractual (Remote)
Location: Remote

Job Description: We are seeking a skilled freelance Data Engineer / ETL Developer to support the migration of ETL pipelines from Informatica to AWS Glue.

Responsibilities:
- Analyze existing Informatica workflows and mappings.
- Design and implement equivalent AWS Glue jobs using PySpark or Glue Studio.
- Migrate data transformation logic to AWS Glue and integrate with services like S3, Redshift, Lambda, and CloudWatch.
- Ensure performance optimization and data quality.
- Deliver documentation and provide post-migration support.

Key Skills: Informatica PowerCenter / Cloud....

Posted 4 months ago


7.0 - 12.0 years

30 - 40 Lacs

Hyderabad

Work from Office

- Support enhancements to the MDM platform
- Develop pipelines using Snowflake, Python, SQL, and Airflow
- Track system performance
- Troubleshoot issues
- Resolve production issues

Required Candidate profile: 5+ years of hands-on, expert-level Snowflake and Python, plus orchestration tools like Airflow. Good understanding of the investment domain. Experience with dbt, cloud experience (AWS, Azure), and DevOps.

Posted 4 months ago


4.0 - 9.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Python; experienced in performing ETL and data engineering concepts (PySpark, NumPy, Pandas, AWS Glue, and Airflow). SQL (exclusively Oracle) hands-on work experience; SQL profilers / query analyzers. AWS cloud-related services (S3, RDS, Redshift). ETL, Python.

Posted 4 months ago


4.0 - 9.0 years

8 - 16 Lacs

Kolkata

Remote

Enhance/modify applications, configure existing systems, and provide user support. DotNET full stack, or [Angular 18+ developer + DotNET backend]. SQL Server. Angular version 18+ (nice to have); Angular version 15+ (mandatory).

Posted 4 months ago


7.0 - 12.0 years

25 - 40 Lacs

Bengaluru

Hybrid

Role: Data Engineer
Experience: 7+ years
Notice: Immediate
Skills: AWS (S3, Glue, Lambda, EC2), Spark, PySpark, Python, Airflow

Posted 4 months ago


2.0 - 7.0 years

20 - 35 Lacs

Pune

Remote

As a software engineer focused on Marketing and Customer Engagement at GoDaddy, you will have the opportunity to design, build, and maintain a platform that is a keystone to our customer experience, marketing, and business objectives. Everything we do starts with data. Ensure our team continues with a “Shift Left” focus on security. This includes the design and development of systems that can contain sensitive customer information. You will partner closely with other GoDaddy teams of Engineers, Marketing Professionals, QA, and Operations. Leverage industry best practices and methodologies such as Agile, Scrum, testing automation, and Continuous Integration and Deployment. ...

Posted 4 months ago


5.0 - 9.0 years

8 - 12 Lacs

Noida

Work from Office

5-9 years in data engineering and software development, such as ELT/ETL, data extraction, and manipulation in Data Lake / Data Warehouse environments. Expert-level hands-on experience with the following:
- Python, SQL
- PySpark
- dbt and Apache Airflow
- DevOps, Jenkins, CI/CD
- Data Governance and Data Quality frameworks
- Data Lakes, Data Warehouses
- AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.
- Source code control: GitHub, VSTS, etc.

Mandatory Competencies: Python; Database - SQL; Data on Cloud - AWS S3; DevOps - CI/CD; DevOps - GitHub; ETL - AWS Glue; Beh - Communication

Posted 4 months ago


3.0 - 8.0 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA
Exp: 3 to 8 years
Location: Gurgaon / Bangalore / Pune / Chennai
Notice: Immediate to 30 days

Key Responsibilities & Skillsets (common skillsets):
- 3+ years of experience in analytics, PySpark, Python, Spark, SQL, and associated data engineering jobs.
- Must have experience with managing and transforming big data sets using PySpark, Spark-Scala, NumPy, Pandas.
- Excellent communication & presentation skills.
- Experience in managing Python codebases and collaborating with customers on model evolution.
- Good knowledge of database management and Hadoop/Spark, SQL, HIVE, Python (expertise).
- Superior analytical and problem-solving skills.
- Should be able to work on a problem independently and p...

Posted 4 months ago


0.0 - 3.0 years

1 - 4 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Company Name: Kinara Capital

Job Description: As a Data Engineer at Kinara Capital, you will play a critical role in building and maintaining the data infrastructure necessary for effective data analysis and decision-making. You will collaborate with data scientists, analysts, and other stakeholders to support data-driven initiatives. Your primary responsibilities will include designing and implementing robust data pipelines, ensuring data quality and integrity, and optimizing data storage and retrieval processes.

Key Responsibilities:
- Develop, construct, test, and maintain data architectures including databases and large-scale processing systems.
- Create and mana...

Posted 4 months ago


6.0 - 11.0 years

20 - 35 Lacs

Gurugram

Hybrid

Must-Have Skills (Core Requirements). Look for resumes that mention hands-on experience with:
- Amazon S3 – storing and organizing data
- AWS Glue – running ETL jobs (basic PySpark knowledge is a plus)
- Glue Catalog – maintaining metadata for datasets
- Amazon Athena – querying data using SQL
- Parquet or CSV – basic familiarity with data file formats
- AWS Lambda – for simple automation or triggers
- Basic IAM knowledge – setting up access permissions
- CloudWatch – monitoring jobs or logs
- Understanding of ETL/ELT pipelines

Good-to-Have Skills (Preferred but not mandatory). These add value, but are not essential at this level:
- AWS Lake Formation – access control and permissions
- Apache Airflow or Step Function...

Posted 4 months ago


2.0 - 5.0 years

10 - 17 Lacs

Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Strong data engineering knowledge and cloud development exposure.
- Proficiency in Python.
- Proficiency in both RDBMS (MySQL preferred) and NoSQL datastores.
- Spark, Cassandra, the AWS data pipeline stack (Athena, S3, Glue Data Catalog, etc.), Airflow, etc. are technologies you have used.
- Very comfortable with data lakes, warehouses, and ETL/ELT paradigms.
- Worked in an agile development environment.
- Optional: basic knowledge of statistical analysis, mathematical modelling, and machine learning.

Experience:
- Have used or are very hands-on with Microservices, Docker, Kubernetes, Gradle/Ant, Kafka, Git/Bitbucket in an agile workplace.
- Develop high quality code with strong unit/integra...

Posted 4 months ago


2.0 - 4.0 years

10 - 11 Lacs

Pune

Work from Office

Role & responsibilities:
- Strong data engineering knowledge and cloud development exposure.
- Proficiency in Python.
- Proficiency in both RDBMS (MySQL preferred) and NoSQL datastores.
- Spark, Cassandra, the AWS data pipeline stack (Athena, S3, Glue Data Catalog, etc.), Airflow, etc. are technologies you have used.
- Very comfortable with data lakes, warehouses, and ETL/ELT paradigms.
- Worked in an agile development environment.
- Optional: basic knowledge of statistical analysis, mathematical modelling, and machine learning.

Experience:
- Have used or are very hands-on with Microservices, Docker, Kubernetes, Gradle/Ant, Kafka, Git/Bitbucket in an agile workplace.
- Develop high quality code with strong unit/integra...

Posted 4 months ago


5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Hiring for a FAANG company. Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).

About the Role: Join a high-impact global business team that is building cutting-edge B2B technology solutions. As part of a structured returnship program, this role is ideal for experienced professionals re-entering the workforce after a career break. You'll work on mission-critical data infrastructure in one of the world's largest cloud-based environments, helping transform enterprise procurement through intelligent architecture and scalable analytics. This role merges consumer-grade experience with enterprise-grade...

Posted 4 months ago


4.0 - 9.0 years

14 - 18 Lacs

Bengaluru

Work from Office

About the Role: Platform Product Owner, Data Pipelines. We're looking for a product-driven, data-savvy Platform Product Owner to lead the evolution of Hevo's Data Pipelines Platform. This role blends strategic product thinking with operational excellence and offers full ownership, from defining product outcomes to driving delivery health and platform reliability. You'll work closely with Engineering, Architecture, and cross-functional teams to shape the platform roadmap, define user value, and ensure successful outcomes through measurable impact. If you're passionate about building scalable, high-impact data products, and excel at balancing strategy with execution, this role is for you. Key Responsibil...

Posted 4 months ago


7.0 - 12.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Your Job: As a Data Engineer you will be a part of a team that designs, develops, and delivers Data Pipelines and Data Analytics Solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Solution India (KGSI) is being developed in India to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floo...

Posted 4 months ago


4.0 - 8.0 years

2 - 6 Lacs

Noida

Work from Office

We are looking for a skilled SQL + Pyspark Developer with 4 to 8 years of experience. The ideal candidate should have strong proficiency in SQL and Pyspark/Python, with the ability to work effectively in a team. Roles and Responsibility Design, develop, and implement data models using SQL and Pyspark/Python. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data systems using Apache Spark and AWS Glue ETL. Ensure data quality and integrity by implementing data validation and testing procedures. Optimize database performance and troubleshoot issues. Participate in code reviews and contribute to improving overall code qual...
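The "data validation and testing procedures" this listing mentions can be sketched as a small rule-driven check. The snippet below is a plain-Python illustration with hypothetical rule names and records; in a Glue/PySpark pipeline the same checks would typically run as DataFrame filters or via a framework such as Great Expectations or Deequ.

```python
# Minimal data-quality validation sketch in plain Python. The rules and
# the 'orders' records are hypothetical illustrations, not a real schema.

def validate(rows, rules):
    """Apply per-row rules; return (valid_rows, errors).

    errors is a list of (row_index, [failed_rule_names]).
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            errors.append((i, failed))
        else:
            valid.append(row)
    return valid, errors

rules = {
    "id_present": lambda r: r.get("order_id") is not None,
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float))
                                 and r["amount"] > 0,
}

orders = [
    {"order_id": 1, "amount": 99.5},
    {"order_id": None, "amount": 10.0},  # fails id_present
    {"order_id": 3, "amount": -5.0},     # fails amount_positive
]

valid, errors = validate(orders, rules)
print(len(valid), errors)
```

Routing failed rows to an errors table (rather than dropping them silently) is the usual pattern, so data-quality issues stay visible downstream.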

Posted 4 months ago


8.0 - 10.0 years

5 - 9 Lacs

Noida

Work from Office

We are looking for a skilled Power BI Dashboarding and Visualization Developer with 8 to 10 years of experience. The ideal candidate will have a strong background in designing and developing interactive dashboards and visualizations using Power BI, as well as integrating and optimizing Power BI solutions within cloud environments. Roles and Responsibility Design and develop interactive dashboards and visualizations using Power BI. Integrate and optimize Power BI solutions within AWS and Azure environments. Collaborate with business users to gather requirements and deliver insights. Ensure data accuracy, security, and performance. Develop and maintain complex data models and reports using Pow...

Posted 4 months ago


3.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are looking for a skilled Data Engineer with 3 to 6 years of experience in processing data pipelines using Databricks, PySpark, and SQL on Cloud distributions like AWS. The ideal candidate should have hands-on experience with Databricks, Spark, SQL, and AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc. Roles and Responsibility Design and develop large-scale data pipelines using Databricks, Spark, and SQL. Optimize data operations using Databricks and Python. Develop solutions to meet business needs reflecting a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit. Evaluate alternative risks and solutions before...

Posted 4 months ago


6.0 - 8.0 years

2 - 6 Lacs

Pune

Work from Office

We are looking for a skilled Python AWS Developer with 6 to 8 years of experience. The ideal candidate will have expertise in developing scalable and efficient applications on the AWS platform. Roles and Responsibility Design, develop, and deploy scalable and efficient applications on the AWS platform. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop high-quality code that meets industry standards and best practices. Troubleshoot and resolve technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated with the latest trends and technologies in Python and AWS development. Job Strong pro...

Posted 4 months ago


4.0 - 8.0 years

6 - 10 Lacs

Chennai

Work from Office

Role profile:
Job Role: Technology Lead
Location: Chennai
Position Type: Full-Time

About the Role / Job Summary: As a Technology Lead, you will play a pivotal role in leading technological initiatives within the organization. You will be responsible for overseeing the development, implementation, and maintenance of technology solutions that align with the company's objectives and drive innovation. This role requires a strong blend of technical expertise, leadership skills, and strategic thinking.

Function: ETL Services
Skills:
- Software development experience (4-8 years)
- Database knowledge (SQL, AWS Glue, Python, PySpark, Postgres, etc.)
- People management
- Preferred: knowledge of DevOps tools
Mi...

Posted 4 months ago


8.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Lead design, development, and deployment of cloud-native and hybrid solutions on AWS and GCP. Ensure robust infrastructure using services like GKE, GCE, Cloud Functions, Cloud Run (GCP) and EC2, Lambda, ECS, S3, etc. (AWS).
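Several of these listings call out AWS Lambda for automation and triggers. The sketch below is a minimal Lambda-style handler that can be exercised locally; the event shape (an S3 object-created notification carrying bucket and key) is a common pattern but hypothetical here, and a deployed function would be wired to a real S3 event source.

```python
import json

# Minimal AWS Lambda-style handler sketch. The event shape is a
# hypothetical S3 notification; a real handler would read the object,
# transform it, or kick off a Glue job instead of just echoing keys.

def handler(event, context=None):
    records = event.get("Records", [])
    keys = [r["s3"]["object"]["key"] for r in records if "s3" in r]
    return {"statusCode": 200, "body": json.dumps({"keys": keys})}

# Local invocation with a sample event, no AWS account needed.
event = {"Records": [{"s3": {"bucket": {"name": "example-data-lake"},
                             "object": {"key": "bronze/orders/2024-01-01.parquet"}}}]}
result = handler(event)
print(result["body"])
```

Keeping the handler a plain function like this makes it trivially unit-testable before it is packaged and deployed.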

Posted 4 months ago


6.0 - 11.0 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- Design, build & deployment of cloud-native and hybrid solutions on AWS and GCP
- Experience in Glue, Athena, PySpark & Step Functions, Lambda, SQL, ETL, DWH, Python, EC2, EBS/EFS, CloudFront, Cloud Functions, Cloud Run (GCP), GKE, GCE, ECS, S3, etc.

Posted 4 months ago


5.0 - 10.0 years

20 - 25 Lacs

Gurugram

Work from Office

Required / Desired:
- Prior experience with writing and debugging Python
- Prior experience with building data pipelines
- Prior experience with data lakes in an AWS environment
- Prior experience with data warehouse technologies in an AWS environment
- Prior experience with AWS EMR
- Prior experience with PySpark
- Candidate should have prior experience with AWS and Azure. Additional cloud-based tools experience is important (see skills section)

Additional desired skills include experience with the following: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience with Python and experience with ...

Posted 4 months ago
