1847 AWS Glue Jobs - Page 31

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Data Engineer at Tredence, you will be responsible for working with various AWS technologies like Glue, Redshift, S3 Tables, and PySpark to build and maintain data pipelines. Your expertise in AWS services will be crucial in ensuring smooth data processing and analysis for the organization. - Utilize AWS Glue and the Glue Data Catalog for data extraction, transformation, and loading processes. - Work with AWS Redshift/Redshift Spectrum to manage and optimize data warehousing solutions. - Implement Managed Airflow/Step Functions for orchestrating data workflows efficiently. - Create and manage AWS S3 Tables/Iceberg/Athena for storing and querying large datasets. - Develop data processing tasks u...

Posted 2 months ago

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an SQL ETL Snowflake Developer, you will leverage your expertise in advanced SQL with a focus on Snowflake. Your primary role involves working in the data integration area and utilizing ETL tools such as Datastage, Informatica, or Snaplogic. Your ability to translate technical requirements into data collection queries is crucial. Collaboration with business and IT teams to convert requirements into queries and a strong understanding of ETL architecture and design are key aspects of your role. - Utilize advanced SQL skills with a specialization in Snowflake for data integration tasks - Use ETL tools like Datastage, Informatica, or Snaplogic to transform technical requirements into data col...
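A core task named above, translating a requirement into a data collection query, can be sketched with a hypothetical schema. This uses the standard library's sqlite3 purely as a stand-in; the same aggregate query is essentially identical in Snowflake SQL.

```python
import sqlite3

# Hypothetical orders table; in practice this would live in Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'APAC', 120.0),
        (2, 'EMEA', 80.0),
        (3, 'APAC', 200.0);
""")

# Business requirement: "total order value per region, highest first".
rows = conn.execute("""
    SELECT region, SUM(amount) AS total_amount
    FROM orders
    GROUP BY region
    ORDER BY total_amount DESC
""").fetchall()

for region, total in rows:
    print(region, total)
```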

Posted 2 months ago

5.0 - 7.0 years

17 - 19 Lacs

Chennai

Work from Office

Job Description: Cloud platform: AWS, Snowflake. AWS: Athena, AWS Glue, Glue workflow, Lambda, S3, EC2, EBS, CloudWatch, VPC, DynamoDB, API Gateway, IAM, CloudFormation, Kinesis, SQS, SNS, Step Functions, QuickSight, Redshift. Programming languages: Python, PySpark, SQL, Java. Scheduler tools: Talend, Airflow. Infrastructure-as-Code tool: Terraform. Ticketing tool: Jira. Operating systems: Linux, Windows. Responsibilities: Perform data engineering activities that include data modeling, analysis, cleansing, processing, extraction, and transformation. Build batch or near-real-time ETL data pipelines using Talend, AWS Glue, Lambda, Kinesis, SQS. Write ETL scripts using Python/PySpark and SQL. Also, ...
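The ETL scripting responsibility above can be sketched in miniature with standard-library Python. The in-memory CSV source and sqlite sink are illustrative stand-ins for an S3 object and a Redshift/Snowflake table; a real job would use Glue or PySpark APIs instead.

```python
import csv
import io
import sqlite3

# Extract: read raw records (stand-in for a CSV object on S3).
raw = "id,event,value\n1,click,10\n2,view,3\n3,click,7\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: filter and cast, as a Glue/PySpark job would at scale.
clicks = [(int(r["id"]), int(r["value"])) for r in records if r["event"] == "click"]

# Load: write to a warehouse table (sqlite stands in for Redshift/Snowflake).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE clicks (id INTEGER, value INTEGER)")
db.executemany("INSERT INTO clicks VALUES (?, ?)", clicks)
total = db.execute("SELECT SUM(value) FROM clicks").fetchone()[0]
print(total)  # 17
```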

Posted 2 months ago

6.0 - 11.0 years

19 - 34 Lacs

Bangalore Rural

Hybrid

Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Posted 2 months ago

6.0 - 11.0 years

19 - 34 Lacs

Chennai

Hybrid

Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Posted 2 months ago

6.0 - 11.0 years

19 - 34 Lacs

Pune

Hybrid

Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Posted 2 months ago

6.0 - 11.0 years

19 - 34 Lacs

Hyderabad

Hybrid

Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Posted 2 months ago

4.0 - 6.0 years

2 - 5 Lacs

Bengaluru

Hybrid

We are seeking an experienced Python Developer with strong AWS expertise to design, develop, and maintain scalable cloud-based data solutions. The ideal candidate will have hands-on experience in Python programming, AWS services, and SQL, along with a good understanding of data pipelines and automation. Key Responsibilities: Develop and deploy data processing and automation workflows using Python and AWS services. Design, implement, and manage solutions using AWS Glue, Lambda, S3, and CloudFormation/AWS Stacks. Write optimized SQL queries and manage data transformations and integrations. Collaborate with data engineering and DevOps teams to ensure scalability and reliability of systems. I...
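Automation workflows like those described above often run as Lambda handlers. Below is a hedged, locally runnable sketch: the handler signature matches Lambda's convention, and the `Records`/`s3`/`bucket`/`object` keys follow the S3 event notification layout, but the event here is a trimmed illustration rather than the full notification schema.

```python
import json

def handler(event, context=None):
    """Illustrative Lambda-style handler for an S3 object-created event."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # A real handler would fetch the object with boto3 and transform it here.
        results.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": results})}

# Hypothetical trimmed event for local testing.
event = {"Records": [{"s3": {"bucket": {"name": "raw-data"},
                             "object": {"key": "2024/orders.csv"}}}]}
print(handler(event))
```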

Posted 2 months ago

12.0 - 20.0 years

30 - 37 Lacs

Chennai

Remote

Job Title: Senior Data Engineer AWS & Data Lakehouse Specialist Job Location: Madurai Working Mode: Remote Experience: 7+ Years Working Hours: 2 PM to 11 PM IST Interview Process: 2-3 rounds of interviews from Tech Mango + 1 level of client interview. Requirement Summary: We are seeking a highly skilled and motivated Senior Data Engineer with overall 8+ years of experience and 4-5 years of hands-on experience in building scalable data solutions on AWS. The ideal candidate will be proficient in Python, PySpark, and AWS Glue, with a strong understanding of Data Lakehouse architecture, especially the medallion model. You wil...

Posted 2 months ago

7.0 - 12.0 years

25 - 30 Lacs

Mumbai

Remote

Please apply only if you can join immediately and have 7+ years of AWS Data Engineer experience with SQL and Git. Job Description: We are seeking a skilled Data Engineer with 7+ years of experience in data processing, ETL pipelines, and cloud-based data solutions. The ideal candidate will have strong expertise in AWS Glue, Redshift, S3, EMR, and Lambda, SQL, and Stored Procedures, with hands-on experience using Python and PySpark for large-scale data transformations. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support analytics and data-driven decision-making. Additionally, the candidate needs to have strong expertise in Terraform and Git-based CI/CD p...

Posted 2 months ago

6.0 - 9.0 years

15 - 25 Lacs

Kolkata

Work from Office

Skill: AWS Glue Experience: 6 to 9 years Location: Kolkata Job description Technical Skills : AWS Glue: 3+ years of hands-on experience in AWS Glue ETL development Python/PySpark: Strong programming skills in Python and PySpark for data transformation AWS Services: Proficiency in S3, Redshift, Athena, Lambda, and EMR Data Formats: Experience with Parquet, Avro, JSON, CSV, and ORC file formats SQL: Advanced SQL skills for data querying and transformation ETL Concepts: Deep understanding of ETL/ELT design patterns and best practices Data Modeling: Knowledge of dimensional modeling, star/snowflake schemas Version Control: Experience with Git/Bitbucket for code management Preferred Skills: Exper...
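Of the file formats listed, CSV and JSON can be converted with the standard library alone; here is a minimal, illustrative sketch with hypothetical records (Parquet, Avro, and ORC need third-party libraries and are omitted).

```python
import csv
import io
import json

# Hypothetical CSV input, as a Glue job might read from S3.
csv_text = "sku,qty\nA100,5\nB200,2\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Emit one JSON document per record (JSON Lines style).
# Note DictReader leaves every value as a string; a real job
# would cast types as part of the transformation.
json_lines = [json.dumps(r) for r in rows]
for line in json_lines:
    print(line)
```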

Posted 2 months ago

2.0 - 5.0 years

4 - 7 Lacs

Karnataka

Work from Office

Responsibilities: Develop and implement data governance frameworks and policies to ensure data trust. Establish data quality standards, metrics, and processes for monitoring and improving data integrity. Implement data privacy measures and ensure compliance with regulations. Collaborate with cross-functional teams to define data stewardship roles and responsibilities. Conduct regular audits and assessments to maintain data quality and security. Implement data management tools and technologies to support data governance initiatives. Provide training and guidance on data trust best practices to internal stakeholders. Stay informed on industry trends and best practices in data governance and priv...

Posted 2 months ago

2.0 - 5.0 years

4 - 7 Lacs

Karnataka

Work from Office

Description: Responsibilities: Develop and implement data governance frameworks and policies to ensure data trust. Establish data quality standards, metrics, and processes for monitoring and improving data integrity. Implement data privacy measures and ensure compliance with regulations. Collaborate with cross-functional teams to define data stewardship roles and responsibilities. Conduct regular audits and assessments to maintain data quality and security. Implement data management tools and technologies to support data governance initiatives. Provide training and guidance on data trust best practices to internal stakeholders. Stay informed on industry trends and best practices in data govern...

Posted 2 months ago

4.0 - 6.0 years

6 - 8 Lacs

Karnataka

Work from Office

Location: Any PSL Location. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT, in particular its limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise: should know the different services, how to configure them, and have infrastructure-as-code experience. Basic understanding of different o...

Posted 2 months ago

1.0 - 2.0 years

2 - 5 Lacs

Karnataka

Work from Office

Experience: 4 to 6 yrs. Location: Any PSL Location. Rate: below $14. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT, in particular its limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise: should know the different services, how to configure them, and have infrastructure-as-code experience. Basic ...

Posted 2 months ago

2.0 - 5.0 years

3 - 7 Lacs

Karnataka

Work from Office

Experience: 4 to 6 yrs. Location: Any PSL Location. Rate: below $14. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT, in particular its limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise: should know the different services, how to configure them, and have infrastructure-as-code experience. Basic ...

Posted 2 months ago

5.0 - 8.0 years

6 - 9 Lacs

Pune

Work from Office

Skills Required: Strong proficiency in PySpark, Scala, and Python. Experience with AWS Glue. Experience Required: Minimum 5 years of relevant experience. Location: Available across all UST locations. Notice Period: Immediate joiners.

Posted 2 months ago

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

Skills: 3-5 years of experience in AWS Glue, Python, and Snowflake. Technical Skills - AWS Glue: Proficiency in developing and managing Spark-based Glue jobs. - Snowflake: Strong experience in Snowflake data warehousing, including schema design, query optimization, and performance tuning. - Kafka: Hands-on experience with Kafka for real-time data streaming and integration. - AWS Lambda: Experience in building serverless functions for event-driven workflows. - Programming: Proficiency in Python. - SQL: Advanced SQL skills for data transformation and analysis. - Cloud Infrastructure: Familiarity with AWS services such as S3, CloudWatch, SES, and IAM. - CI/CD: Familiarity with Terraform and Git actio...

Posted 2 months ago

4.0 - 7.0 years

20 - 30 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Roles and Responsibilities Design, develop, and maintain data governance frameworks, policies, and procedures to ensure compliance with regulatory requirements. Collaborate with stakeholders to identify business needs and implement solutions using AWS Glue, Pyspark, SQL, and other tools. Develop automated data quality checks using AWS services such as S3 Bucket Policies and Lambda Functions. Ensure adherence to company standards for data management best practices through regular audits and monitoring. Provide training and support to team members on new technologies and processes. Desired Candidate Profile 4-7 years of experience in Data Management or related field (Data Engineering). B.Tech/...
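The automated data quality checks described above can be sketched as plain-Python rules. The field names (`customer_id`, `age`) and thresholds are hypothetical; in practice rules like these would run inside a Lambda function or Glue job and alert on failures.

```python
def check_quality(rows):
    """Return a list of (row_index, issue) pairs for records that fail the rules."""
    issues = []
    for i, row in enumerate(rows):
        # Rule 1: required field must be present and non-empty.
        if row.get("customer_id") in (None, ""):
            issues.append((i, "missing customer_id"))
        # Rule 2: numeric field must fall in a plausible range.
        if not (0 <= row.get("age", 0) <= 120):
            issues.append((i, "age out of range"))
    return issues

# Hypothetical sample batch with two bad records.
rows = [
    {"customer_id": "C1", "age": 34},
    {"customer_id": "", "age": 29},
    {"customer_id": "C3", "age": 200},
]
print(check_quality(rows))
```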

Posted 2 months ago

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We are counting on your unique voice and perspective to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all. Key Responsibilities: - Develop, deploy, and monitor machine learning models in production environments. - Automate ML pipelines for model training, validation, and deployment. - Optimize ML model performance, scalability, and cost efficiency. - Implement CI/CD workflows for ML model versioning, testing, and deplo...

Posted 2 months ago

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As an AWS Developer at Barclays, you will be responsible for supporting the successful delivery of Location Strategy projects within the planned budget, agreed quality, and governance standards. You will play a key role in evolving the digital landscape, driving innovation, and excellence. By harnessing cutting-edge technology, you will revolutionize digital offerings to ensure unparalleled customer experiences. - Experience with AWS cloud services such as Lambda, S3, Step Function, data analytics services like Athena, Redshift, Snowflake. - Proficiency in programming languages like Python/Pandas, Spark. - Hands-on experience in handling high volumes of data processing using AWS Glue, PySpar...

Posted 2 months ago

4.0 - 8.0 years

0 Lacs

Gandhinagar, Gujarat

On-site

As a Data Engineer, you will be responsible for managing and optimizing data processes to ensure efficient data flow within the organization. You will utilize your 4+ years of experience to design and implement data solutions using SQL. Working hands-on with modern cloud data warehousing solutions such as Snowflake, BigQuery, and Redshift is a key part of your role. Additionally, you will demonstrate expertise in ETL/ELT processes for batch and streaming data processing, troubleshoot data issues effectively, and propose solutions for data optimization. You will also apply your knowledge of AWS services including S3, DMS, Glue, and Athena. Familiarizing yourself with DBT for data transformat...

Posted 2 months ago

6.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Roles and responsibilities Design AWS architectures based on business requirements. Create architectural diagrams and documentation. Present cloud solutions to stakeholders. Skills and Qualifications: Strong knowledge of AWS services: EC2, S3, RDS, Lambda, IAM, VPC, CloudFormation, CloudWatch, etc. Experience with serverless architectures (AWS Lambda, API Gateway, Step Functions). Experience with AWS networking (VPC, Route 53, ELB, Security Groups, etc.). Experience with AWS CloudFormation for automating infrastructure. Proficiency in scripting languages such as Python or Bash. Experience with automation tools (AWS Systems Manager, AWS Lambda). Experience with containerization (Docker, Kubernetes...

Posted 2 months ago

3.0 - 5.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework using Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developin...
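The ingest, process, and transform stages described above can be illustrated with plain Python generators as a stand-in for Spark: each stage consumes the previous one lazily, roughly the way Spark chains transformations before an action triggers the aggregation.

```python
def ingest(lines):
    """Yield cleaned raw records (stand-in for reading files/streams)."""
    for line in lines:
        yield line.strip()

def process(records):
    """Parse each record into a (key, value) pair."""
    for rec in records:
        name, value = rec.split(",")
        yield name, int(value)

def transform(pairs):
    """Aggregate values per key, loosely like Spark's reduceByKey."""
    totals = {}
    for name, value in pairs:
        totals[name] = totals.get(name, 0) + value
    return totals

# Hypothetical source data.
source = ["a,1", "b,2", "a,3"]
result = transform(process(ingest(source)))
print(result)  # {'a': 4, 'b': 2}
```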

Posted 2 months ago

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role Project Role: Application Lead Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: AWS BigData. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also en...

Posted 2 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies