2.0 - 5.0 years
4 - 7 Lacs
karnataka
Work from Office
Role: Sr. Python Developer. Must have: Python, Spark/Postgres SQL, Spark/Apache Spark (AWS). Strong development experience using Python and SQL on AWS using Glue and Lambda.
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
tamil nadu
Work from Office
Description: Responsibilities - Must have: 1. Strong expertise in SQL, Python, PySpark. 2. Good knowledge of data warehousing techniques. 3. Good knowledge of AWS Big Data services and Snowflake. Design, develop, and maintain scalable data pipelines and architectures for data processing and integration. Implement data streaming solutions to handle real-time data ingestion and processing. Utilize Python and PySpark to develop and optimize data workflows. Leverage AWS services such as S3, Redshift, Glue, Kinesis, and Lambda for data storage, processing, and analytics. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality i...
Posted 3 weeks ago
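For illustration only, a minimal sketch (Python, boto3) of the kind of real-time ingestion step this posting describes: publishing JSON events to a Kinesis data stream for downstream processing. The stream name, region, and event fields are hypothetical placeholders, not details from the posting.

import json

import boto3  # AWS SDK for Python

# Hypothetical stream and region, for illustration only.
kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_event(event: dict, stream_name: str = "orders-stream") -> None:
    """Send one JSON event to a Kinesis data stream."""
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("order_id", "default")),
    )

publish_event({"order_id": 42, "amount": 199.0, "currency": "INR"})

A Glue or Lambda consumer could then read the stream and land the records in S3 or Redshift, which is the pipeline shape the posting outlines.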
2.0 - 7.0 years
4 - 9 Lacs
andhra pradesh
Work from Office
Description: 1. Hands-on industry experience in design and coding from scratch in AWS Glue/PySpark with services like S3, DynamoDB, Step Functions, etc. 2. Hands-on industry experience in design and coding from scratch in Snowflake. 3. Experience in PySpark/Snowflake of 1 to 3 years, with overall around 5 years of experience in building data/analytics solutions. Level: Senior Consultant or below. Additional Details: Global Grade: C; Level: To Be Defined; Named Job Posting? (if Yes - needs to be approved by SCSC): No; Remote work possibility: Yes; Global Role Family: 60236 (P) Software Engineering; Local Role Name: 6361 Software Engineer; Local Skills: 59383 AWS G...
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
andhra pradesh
Work from Office
Description: Primary/Mandatory Experience - Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.) - Proficient in AWS serverless technologies - Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytic projects on the cloud - Candidate must have hands-on technical experience with AWS cloud-native technologies along with traditional ETL tools - Snowflake & DWH experience is a plus. Daily Activity: Excellent written and verbal communication skills and the ability to lead meetings with technical peers and clients regarding solution designs. Ability to communicate with Business Analysts, Data Modellers, Cloud Ar...
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
andhra pradesh
Work from Office
Description: Primary/Mandatory Experience - Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.) - Proficient in AWS serverless technologies - Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytic projects on the cloud - Candidate must have hands-on technical experience with AWS cloud-native technologies along with traditional ETL tools - Snowflake & DWH experience is a plus. Daily Activity: Excellent written and verbal communication skills and the ability to lead meetings with technical peers and clients regarding solution designs. Ability to communicate with Business Analysts, Data Modellers, Cloud Ar...
Posted 3 weeks ago
9.0 - 14.0 years
20 - 25 Lacs
bengaluru
Hybrid
Sr. Data Engineer About the Role The Senior Data Engineer will play a pivotal role in designing, developing, and maintaining scalable, high-performance data pipelines and infrastructure. You will lead multiple data engineering initiatives, collaborate with cross-functional teams, and deliver enterprise-grade data solutions aligned with business goals. This role is ideal for a proactive problem solver with strong leadership skills, hands-on expertise in PySpark, AWS Glue, Databricks, and S3, and the ability to translate complex technical concepts into actionable insights for business stakeholders. Key Responsibilities Lead functional teams and manage multiple data engineering projects simult...
Posted 3 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
chennai
Work from Office
Skills Required: Should have a minimum of 2+ years in Data Engineering / Data Analytics platforms. Should have a strong hands-on design and engineering background in AWS, across a wide range of AWS services, with the ability to demonstrate work on large engagements. Strong experience with Python, SQL, PySpark, Scala, shell scripting, etc. Strong experience with workflow management & orchestration tools (Airflow). Build and manage ETL workflows on AWS Glue. Develop and orchestrate data workflows using Apache Airflow. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Optimize existing data solutions for performance and scalability. Ensure data qua...
Posted 3 weeks ago
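As a rough sketch of the Glue-plus-Airflow orchestration this posting asks for: an Airflow DAG that triggers an existing AWS Glue job once a day. It assumes Airflow 2.x with the apache-airflow-providers-amazon package installed; the DAG id, Glue job name, and region are placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_glue_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_glue_job = GlueJobOperator(
        task_id="run_sales_etl",
        job_name="sales-etl-job",                # an existing Glue job (placeholder name)
        script_args={"--run_date": "{{ ds }}"},  # pass the logical date to the Glue script
        region_name="ap-south-1",
    )

Airflow handles scheduling and retries while Glue runs the Spark workload, which keeps orchestration and compute cleanly separated.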
3.0 - 5.0 years
0 Lacs
madurai
Work from Office
Immediate Joiners & Ready to Relocate to Madurai. Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data processing pipelines using PySpark on Azure Databricks. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions that meet business needs. Troubleshoot complex issues related to big data processing, ETL processes, and SQL queries. Ensure scalability, performance, and reliability of the system by monitoring logs, applying debugging techniques, and implementing optimization strategies. Stay up-to-date with industry trends in big data technologies such as the Hadoop ecosystem (HDFS), Spark (PySpark), and AWS Glue.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
As a Data Engineer at Tredence, you will be responsible for working with various AWS technologies like Glue, Redshift, S3 Tables, and PySpark to build and maintain data pipelines. Your expertise in AWS services will be crucial in ensuring smooth data processing and analysis for the organization. - Utilize AWS Glue and Glue catalogue for data extraction, transformation, and loading processes. - Work with AWS Redshift/Redshift Spectrum to manage and optimize data warehousing solutions. - Implement Managed Airflow/Step Function for orchestrating data workflows efficiently. - Create and manage AWS S3 Tables/Iceberg/Athena for storing and querying large datasets. - Develop data processing tasks u...
Posted 3 weeks ago
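A hedged sketch of one task this posting lists: querying a catalogued S3/Iceberg table through Athena with boto3. The database, table, and result-bucket names are placeholders.

import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Submit an Athena query and poll until it reaches a terminal state."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return qid
        time.sleep(2)

run_athena_query(
    "SELECT count(*) FROM sales WHERE sale_date = DATE '2024-01-01'",
    database="analytics",
    output_s3="s3://example-athena-results/",
)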
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As an SQL ETL Snowflake Developer, you will leverage your expertise in advanced SQL with a focus on Snowflake. Your primary role involves working in the data integration area and utilizing ETL tools such as Datastage, Informatica, or Snaplogic. Your ability to translate technical requirements into data collection queries is crucial. Collaboration with business and IT teams to convert requirements into queries and a strong understanding of ETL architecture and design are key aspects of your role. - Utilize advanced SQL skills with a specialization in Snowflake for data integration tasks - Use ETL tools like Datastage, Informatica, or Snaplogic to transform technical requirements into data col...
Posted 3 weeks ago
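To illustrate the Snowflake side of the ETL work described above, a minimal Python sketch using the snowflake-connector-python package to run an ELT-style MERGE inside Snowflake. The connection parameters, schemas, and table names are placeholders; real credentials would come from a secrets manager.

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Upsert staged rows into a reporting table, a common data-integration step.
    cur.execute("""
        MERGE INTO reporting.customers AS tgt
        USING staging.customers_stg AS src
          ON tgt.customer_id = src.customer_id
        WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
        WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
             VALUES (src.customer_id, src.email, src.updated_at)
    """)
finally:
    conn.close()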
5.0 - 7.0 years
17 - 19 Lacs
chennai
Work from Office
Job Description: Cloud platform: AWS, Snowflake. AWS: Athena, AWS Glue, Glue Workflows, Lambda, S3, EC2, EBS, CloudWatch, VPC, DynamoDB, API Gateway, IAM, CloudFormation, Kinesis, SQS, SNS, Step Functions, QuickSight, Redshift. Programming languages: Python, PySpark, SQL, Java. Scheduler tools: Talend, Airflow. Infrastructure-as-code tool: Terraform. Ticketing tool: Jira. Operating systems: Linux, Windows. Responsibilities: Perform data engineering activities that include data modeling, analysis, cleansing, processing, extraction, and transformation. Build batch or near real-time ETL data pipelines using Talend, AWS Glue, Lambda, Kinesis, SQS. Write ETL scripts using Python/PySpark and SQL. Also, ...
Posted 3 weeks ago
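A minimal sketch of the kind of Glue ETL script this stack implies: a PySpark Glue job that reads a table from the Glue Data Catalog, applies a simple cleansing step, and writes partitioned Parquet to S3. The catalog database, table, and bucket names are placeholders.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog (placeholder names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

# Basic cleansing: drop duplicate orders and filter out invalid amounts.
cleaned = orders.dropDuplicates(["order_id"]).filter("amount > 0")

# Land the curated data in S3 as partitioned Parquet.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-curated/orders/")

job.commit()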
6.0 - 11.0 years
19 - 34 Lacs
bangalore rural
Hybrid
Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.
Posted 3 weeks ago
6.0 - 11.0 years
19 - 34 Lacs
chennai
Hybrid
Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.
Posted 3 weeks ago
6.0 - 11.0 years
19 - 34 Lacs
pune
Hybrid
Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.
Posted 3 weeks ago
6.0 - 11.0 years
19 - 34 Lacs
hyderabad
Hybrid
Primarily looking for a Data Engineer with expertise in processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.
Posted 3 weeks ago
4.0 - 6.0 years
2 - 5 Lacs
bengaluru
Hybrid
We are seeking an experienced Python Developer with strong AWS expertise to design, develop, and maintain scalable cloud-based data solutions. The ideal candidate will have hands-on experience in Python programming, AWS services, and SQL, along with a good understanding of data pipelines and automation. Key Responsibilities: Develop and deploy data processing and automation workflows using Python and AWS services. Design, implement, and manage solutions using AWS Glue, Lambda, S3, and CloudFormation/AWS Stacks. Write optimized SQL queries and manage data transformations and integrations. Collaborate with data engineering and DevOps teams to ensure scalability and reliability of systems. I...
Posted 3 weeks ago
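As a small sketch of tying Lambda, Glue, and S3 together the way this posting describes: a hypothetical Lambda handler, wired to an S3 ObjectCreated notification, that starts a Glue job run for each new file. The Glue job name is a placeholder.

import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    """Start a Glue job run for every object reported in the S3 event."""
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        run = glue.start_job_run(
            JobName="landing-to-curated",  # placeholder Glue job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        started.append(run["JobRunId"])
    return {"started_runs": started}

The function, its IAM role, and the S3 trigger would typically be declared in a CloudFormation stack, matching the CloudFormation/AWS Stacks requirement above.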
12.0 - 20.0 years
30 - 37 Lacs
chennai
Remote
Job Title: Senior Data Engineer - AWS & Data Lakehouse Specialist. Job Location: Madurai. Working Mode: Remote. Experience: 7+ Years. Working Hours: 2 PM to 11 PM IST. Interview Process: 2-3 rounds of interviews from Tech Mango + 1 level of client interview. Requirement Summary: We are seeking a highly skilled and motivated Senior Data Engineer with overall 8+ years of experience and 4-5 years of hands-on experience in building scalable data solutions on AWS. The ideal candidate will be proficient in Python, PySpark, and AWS Glue, with a strong understanding of Data Lakehouse architecture, especially the medallion model. You wil...
Posted 3 weeks ago
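For readers unfamiliar with the medallion model mentioned above, a brief PySpark sketch of a bronze-to-silver step: raw JSON is de-duplicated, typed, and filtered, then written to a conformed Parquet layer. Bucket paths and column names are placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw, append-only landing data.
bronze = spark.read.json("s3://example-lake/bronze/orders/")

# Silver: de-duplicated, typed, quality-checked records.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount").isNotNull())
)

silver.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-lake/silver/orders/")

A further gold layer would aggregate the silver tables into business-facing marts, completing the medallion layout.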
7.0 - 12.0 years
25 - 30 Lacs
mumbai
Remote
Please apply if you can join immediately and have 7+ years of AWS Data Engineer experience with SQL and Git. Job Description: We are seeking a skilled Data Engineer with 7+ years of experience in data processing, ETL pipelines, and cloud-based data solutions. The ideal candidate will have strong expertise in AWS Glue, Redshift, S3, EMR, and Lambda, SQL, and stored procedures, with hands-on experience using Python and PySpark for large-scale data transformations. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support analytics and data-driven decision-making. Additionally, the candidate needs to have strong expertise in Terraform and Git-based CI/CD p...
Posted 3 weeks ago
6.0 - 9.0 years
15 - 25 Lacs
kolkata
Work from Office
Skill: AWS Glue. Experience: 6 to 9 years. Location: Kolkata. Job description - Technical Skills: AWS Glue: 3+ years of hands-on experience in AWS Glue ETL development. Python/PySpark: Strong programming skills in Python and PySpark for data transformation. AWS Services: Proficiency in S3, Redshift, Athena, Lambda, and EMR. Data Formats: Experience with Parquet, Avro, JSON, CSV, and ORC file formats. SQL: Advanced SQL skills for data querying and transformation. ETL Concepts: Deep understanding of ETL/ELT design patterns and best practices. Data Modeling: Knowledge of dimensional modeling, star/snowflake schemas. Version Control: Experience with Git/Bitbucket for code management. Preferred Skills: Exper...
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
karnataka
Work from Office
Responsibilities: Develop and implement data governance frameworks and policies to ensure data trust. Establish data quality standards, metrics, and processes for monitoring and improving data integrity. Implement data privacy measures and ensure compliance with regulations. Collaborate with cross-functional teams to define data stewardship roles and responsibilities. Conduct regular audits and assessments to maintain data quality and security. Implement data management tools and technologies to support data governance initiatives. Provide training and guidance on data trust best practices to internal stakeholders. Stay informed on industry trends and best practices in data governance and priv...
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
karnataka
Work from Office
Description: Responsibilities: Develop and implement data governance frameworks and policies to ensure data trust. Establish data quality standards, metrics, and processes for monitoring and improving data integrity. Implement data privacy measures and ensure compliance with regulations. Collaborate with cross-functional teams to define data stewardship roles and responsibilities. Conduct regular audits and assessments to maintain data quality and security. Implement data management tools and technologies to support data governance initiatives. Provide training and guidance on data trust best practices to internal stakeholders. Stay informed on industry trends and best practices in data govern...
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
karnataka
Work from Office
Location: Any PSL Location. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT - basically, should understand DBT limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise - should know the different services and how to configure them, plus infra-as-code experience. Basic understanding of different o...
Posted 3 weeks ago
1.0 - 2.0 years
2 - 5 Lacs
karnataka
Work from Office
EXP: 4 to 6 yrs. Location: Any PSL Location. Rate: below $14. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT - basically, should understand DBT limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise - should know the different services and how to configure them, plus infra-as-code experience. Basic ...
Posted 3 weeks ago
2.0 - 5.0 years
3 - 7 Lacs
karnataka
Work from Office
EXP: 4 to 6 yrs. Location: Any PSL Location. Rate: below $14. JD - DBT/AWS Glue/Python/PySpark. Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how it works. Good skills in PySpark. Good understanding of DBT - basically, should understand DBT limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise - should know the different services and how to configure them, plus infra-as-code experience. Basic ...
Posted 3 weeks ago
5.0 - 8.0 years
6 - 9 Lacs
pune
Work from Office
Skills Required: Strong proficiency in PySpark, Scala, and Python. Experience with AWS Glue. Experience Required: Minimum 5 years of relevant experience. Location: Available across all UST locations. Notice Period: Immediate joiners.
Posted 3 weeks ago