9.0 - 13.0 years
18 - 33 Lacs
hyderabad
Work from Office
Design and implement end-to-end ETL/ELT pipelines using Databricks and Snowflake across structured, semi-structured, and unstructured data formats. Skills: Python, PySpark, Databricks, Snowflake, SQL
Posted 4 days ago
6.0 - 8.0 years
4 - 8 Lacs
chennai, bengaluru
Work from Office
Skills: Complex Databricks implementations; Architect-level certification from Databricks; Python, Scala, and SQL; strong depth in Lakehouse architecture with Delta tables; expertise in the Spark ecosystem, including the DataFrame API, Spark Streaming, the Dataset API, RDD APIs, and Spark SQL. Notice Period: Immediate to 45 days
Posted 4 days ago
5.0 - 8.0 years
5 - 11 Lacs
hyderabad, chennai, bengaluru
Work from Office
Role & responsibilities: Strong experience in data engineering/migration projects on GCP (along with Python) is required; the candidate should be able to work independently. Skills: Python, BigQuery, Dataflow, Composer, SQL, Pub/Sub, Dataproc, etc. Preferred candidate profile: same as the requirements above.
Posted 4 days ago
3.0 years
0 Lacs
gurugram, haryana, india
On-site
About us: Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. BCN is an...
Posted 4 days ago
6.0 - 10.0 years
6 - 10 Lacs
mumbai, hyderabad
Work from Office
- The candidate should have solid experience in providing application support and be well versed in incident/problem management. Should be experienced in leading a team and helping build a team equipped with the skills/knowledge to support cloud-enabled applications. - Good understanding of Azure services such as ADF, ADLS, SQL, Azure Monitor, Application Insights, and Azure Functions. Should be comfortable with APIs and their implementation, and comfortable troubleshooting/coordinating any service issues on Azure. A certification covering this area is good to have. - Application support will be 24/7 and releases are scheduled for weekends, so the candidate should be flexible regarding tim...
Posted 4 days ago
6.0 - 8.0 years
4 - 7 Lacs
mumbai, hyderabad
Work from Office
Exposure to machine learning models leveraging scikit-learn and associated libraries. Strong skills in Flask/FastAPI. Ability to write Helm charts to manage complex deployments supporting deployment models such as rolling update and canary; experience with managing parent-child subcharts to optimize deployments. Exposure to various Python frameworks and the ability to assess Python model performance bottlenecks and optimization techniques. Secondary skills: understanding of NGINX load balancers, Istio service mesh, and API gateways is a plus; PySpark exposure to handle complex data pipelines is a plus.
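To make the model-serving side of this role concrete, here is a minimal sketch of a FastAPI endpoint exposing a scikit-learn model; the model file name, endpoint path, and feature layout are illustrative assumptions, not details from the listing.

# Hypothetical FastAPI service wrapping a pre-trained scikit-learn model (names are placeholders).
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed pre-trained artifact saved with joblib

class Features(BaseModel):
    values: list[float]  # flat feature vector in the order the model expects

@app.post("/predict")
def predict(features: Features):
    # scikit-learn estimators expect a 2-D array: one row per sample
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}

A service like this would typically be packaged behind a Helm chart whose values control the rollout strategy (rolling update vs. canary) mentioned above.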
Posted 4 days ago
5.0 - 10.0 years
5 - 9 Lacs
hyderabad
Hybrid
Notice Period: Immediate. Employment Type: Contract. Total experience: 5+ years, with 4+ years of hands-on experience using Azure Cloud, ADLS, ADF & Databricks. Finance domain data stewardship; finance data reconciliation with SAP downstream systems. Run/monitor pipelines and validate Databricks notebooks. Able to interface with onsite/business stakeholders. Python, SQL; hands-on knowledge of Snowflake/DW is desirable.
Posted 4 days ago
5.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Title: Data Engineer - PySpark. Experience: 5 to 8 years. Location: Pune/Hyderabad. Job Description - Required Skills: 5+ years of experience in Big Data and PySpark. Must-Have: good work experience on Big Data platforms such as Hadoop, Spark, Scala, Hive, Impala, and SQL. Good-to-Have: good Spark, PySpark, and Big Data experience; Spark UI/optimization/debugging techniques; good Python scripting skills; intermediate SQL exposure – subqueries, joins, CTEs; database technologies; AWS EMR, S3, IAM, Lambda, SNS, SQS; experience working on data engineering projects; good understanding of SQL; good understanding of Unix and HDFS comm...
Posted 4 days ago
175.0 years
0 Lacs
gurgaon, haryana, india
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? This role will be part of the Treasury Applications Platform team; we are currently modernizing our platform, migrating i...
Posted 4 days ago
3.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Role: AWS Data Engineer. Job Location: Chennai, Pune. Experience Requirement: 5+ years. Required technical skills: strong knowledge of AWS Glue, AWS Redshift, SQL, and ETL; good knowledge of and experience in PySpark for building complex transformation logic. Skills: AWS Data Engineer, SQL, ETL, DWH; Secondary: AWS Glue, Airflow. Must-Have: good knowledge of SQL and ETL; a minimum of 3+ years' experience with and understanding of Python core concepts, and implementing data pipeline frameworks using PySpark and AWS; works well independently as well as within a team; good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift; proactive and organized, with excellent analytical and problem-solving skills; flexibl...
Posted 4 days ago
5.0 years
0 Lacs
hyderabad, telangana, india
On-site
Role: Data Engineer. Experience: 5 years. Location: Hyderabad. Timings: 2pm-11pm (WFO). Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines for ingesting and processing structured and unstructured retail data. Work with transactional, POS, inventory, and customer datasets specific to the grocery retail domain. Optimize data workflows and improve data quality, governance, and reliability. Collaborate with business analysts, data scientists, and product teams to support reporting and analytical needs. Ensure data availability and accuracy for dashboards, BI tools, and advanced analytics. Implement best practices in data modeling, storage, and query performance optimizati...
Posted 4 days ago
8.0 years
0 Lacs
noida, uttar pradesh, india
On-site
JD as provided below: • 8+ years of overall IT experience, which includes hands-on experience in Big Data technologies. • Mandatory - hands-on experience in Python and PySpark. • Build PySpark applications using Spark DataFrames in Python. • Worked on optimizing Spark jobs that process huge volumes of data. • Hands-on experience with version control tools like Git. • Worked on Amazon's analytics services like Amazon EMR, Amazon Athena, and AWS Glue. • Worked on Amazon's compute services like AWS Lambda and Amazon EC2, Amazon's storage service S3, and a few other services like SNS. • Good to have: knowledge of data warehousing concepts – dimensions, facts, schemas (snowflake, star, etc.). • Have wo...
Posted 4 days ago
4.0 years
0 Lacs
hyderabad, telangana, india
On-site
· 4+ years of advanced working knowledge of SQL, Python, and PySpark. AWS exposure. The candidate should be equally good at Python and PySpark; coding is important.
Posted 4 days ago
6.0 years
0 Lacs
hyderabad, telangana, india
On-site
Job Title: AWS Data Engineer. Location: Hyderabad (Flexible). Experience Level: 6+ years. Essential Requirements: 6–7 years of IT experience with a minimum of 3 years in Data Engineering / Data Architecture (data lakes, ingestion, extraction, transformation). Strong hands-on expertise in AWS cloud technologies, including AWS Lambda, S3, DynamoDB, EC2, ECS, ECR, CloudFormation, AWS Glue, Glue Catalog, Athena, and DMS. Proficiency in Python and PySpark for data processing and automation. Solid experience with SQL and PL/SQL concepts. Experience in data ingestion, transformation, and orchestration pipelines. Hands-on experience in building data pipelines and applications to stream and process large datasets a...
Posted 4 days ago
8.0 years
0 Lacs
indore, madhya pradesh, india
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire AWS SageMaker professionals in the following areas: Experience: 8+ years. Job Description - Key Responsibilities: Data Engineering - Data Lakehouse: Simplify analytics and ...
Posted 4 days ago
0 years
0 Lacs
mumbai, maharashtra, india
On-site
Strong experience in PySpark: hands-on expertise in building scalable data pipelines using PySpark; proficiency in using the Spark SQL, DataFrame, and RDD APIs to implement complex business logic. Proficient programming skills: solid coding skills in Python (preferred), with strong fundamentals in data structures, algorithms, and software engineering principles. Data pipeline development: proven experience designing, developing, and maintaining batch and streaming data pipelines; understanding of ETL/ELT processes and best practices for data transformation, data quality, and performance optimization. Knowledge of the modern data engineering ecosystem: familiarity with the curren...
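As a rough illustration of the DataFrame and Spark SQL usage this listing asks for, a minimal sketch follows; the input path, view name, and column names (orders, region, amount) are assumptions made for the example, not details from the posting.

# Minimal PySpark sketch: the same aggregation via the DataFrame API and via Spark SQL.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_example").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")  # placeholder input path

# DataFrame API: total order amount per region
totals_df = orders.groupBy("region").agg(F.sum("amount").alias("total_amount"))

# Equivalent logic expressed in Spark SQL over a temporary view
orders.createOrReplaceTempView("orders")
totals_sql = spark.sql(
    "SELECT region, SUM(amount) AS total_amount FROM orders GROUP BY region"
)

totals_df.write.mode("overwrite").parquet("s3://example-bucket/order_totals/")  # placeholder output path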
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
gurugram, haryana, india
On-site
About Company: Our Client Corporation provides digital engineering and technology services to Forbes Global 2000 companies worldwide. Our Engineering First approach ensures we can execute all ideas and creatively solve pressing business challenges. With industry expertise and empowered agile teams, we prioritize execution early in the process for impactful results. We combine logic, creativity and curiosity to build, solve, and create. Every day, we help clients engage with new technology paradigms, creatively building solutions that solve their most pressing business challenges and move them to the forefront of their industry. Job Title: GCP Big Data Engineer. Key Skills: PySpark, Airflow...
Posted 4 days ago
5.0 years
0 Lacs
gurugram, haryana, india
On-site
This role is for one of Weekday's clients. Min Experience: 5 years. Location: Gurgaon. Job Type: full-time. We are seeking an experienced Senior Data Engineer with strong expertise in building and managing large-scale data pipelines within the AWS ecosystem. The ideal candidate will have a solid background in SQL, cloud-native data platforms, and orchestration frameworks, with a deep understanding of scalable data lake and warehouse architectures. Requirements - Key Responsibilities: Design, develop, and maintain robust, scalable data pipelines and ETL/ELT workflows. Leverage the AWS Data Stack (S3, Glue, Kinesis, Redshift, Data Lake) for data ingestion, transformation, and storage. Write, optim...
Posted 4 days ago
0 years
0 Lacs
chennai, tamil nadu, india
On-site
Notice period: 30 days to immediate. Job description: Mandatory certificate: Databricks Certified Developer – Apache Spark 3.0. Skills: Databricks, Python or Java, PySpark, Spark, Spark SQL, ETL, Hadoop, Databricks certification. Responsibilities: Ensure effective design, development, validation, and support activities in line with client needs and architectural requirements. Ensure continual knowledge management and adherence to organizational guidelines and processes. As part of the delivery team, your primary role would be to ensure effective design, development, validation, and support activities to assure that our clients are satisfied with the high levels of ...
Posted 4 days ago
4.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Open Location - Indore, Noida, Gurgaon, Bangalore, Hyderabad, Pune. Immediate joiners are preferred. Qualifications: 4 years of good hands-on exposure to Big Data technologies – PySpark (DataFrame and Spark SQL), Hadoop, and Hive. Good hands-on experience with Python and Bash scripts. Good understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis, and research skills. Demonstrable ability to think outside the box and not be dependent on readily available tools. Excellent communication, presentation, and interpersonal skills are a must. Hands-on experience using Big Data technologies provided by cloud platforms. Orchestration with Airflow and any job schedule...
Posted 4 days ago
8.0 years
0 Lacs
hyderabad, telangana, india
Remote
Job Description: Azure Data Engineer. Location: Pan India (Hybrid). Experience: 5–8 Years (STRICTLY). Employment Type: Permanent. Notice Period: Immediate Joiners Only. CTC: Up to 21 LPA. About the Company: Our client is a trusted global innovator of IT and business services, present in 50+ countries. They specialize in digital & IT modernization, consulting, managed services, and industry-specific solutions. With a commitment to long-term success, they empower clients and society to move confidently into the digital future. Roles & Responsibilities: Design, build, and maintain scalable data pipelines using Azure Data Factory (ADF), Databricks, Synapse, and Azure Data Lake, ServiceNow, Power BI, ...
Posted 4 days ago
5.0 years
0 Lacs
pune, maharashtra, india
On-site
Hi All, We are hiring for a permanent role at LTI Mindtree! Role: GCP Data Engineer. Experience: 5+ Years. Location: PAN India. Notice Period: Immediate to 30 Days. 📩 Interested candidates can share their updated CVs at Neha.Ingale@alphacom.in. Job Description: We are looking for an experienced GCP Data Engineer with strong expertise in Google Cloud Platform services. The ideal candidate should have hands-on experience with GCP data stack components such as BigQuery, Dataflow, Dataproc, Cloud Composer, PySpark, and Python. Knowledge of Tableau or MSTR is mandatory. This is a great opportunity to work on innovative data engineering solutions in a fast-paced, collaborative environment. Key Responsi...
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform Engineer, your role will involve designing, building, and maintaining scalable data infrastructure using AWS cloud services. You will utilize your expertise in Python, PySpark, EMR, and Apache Airflow to develop robust data pipelines and analytics solutions that drive business insights. Key Responsibilities: - Design and implement scalable data pipelines using Apache Airflow - Build and optimize AWS EMR clusters for big data processing - Develop data processing applications using Python and PySpark - Create ETL workflows for data ingestion and transformation - Monitor and troubleshoot data platform performance - Collaborate with data scientists and analysts on data require...
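To ground the Airflow and EMR responsibilities above, here is a minimal sketch of a DAG that submits a PySpark step to an already-running EMR cluster; the cluster ID, script path, and schedule are placeholders, and it assumes a recent Airflow 2.x with the Amazon provider package installed.

# Hypothetical Airflow DAG adding a PySpark step to an existing EMR cluster (IDs and paths are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator

SPARK_STEP = [
    {
        "Name": "transform_orders",  # illustrative step name
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://example-bucket/jobs/transform_orders.py"],
        },
    }
]

with DAG(
    dag_id="emr_pyspark_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # placeholder schedule
    catchup=False,
) as dag:
    submit_spark_step = EmrAddStepsOperator(
        task_id="submit_spark_step",
        job_flow_id="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
        steps=SPARK_STEP,
    )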
Posted 4 days ago
6.0 years
0 Lacs
gurgaon, haryana, india
On-site
Who You'll Work With Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and...
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
Role Overview: As an AWS Senior Data Engineer, you will be responsible for designing and maintaining scalable ETL pipelines using AWS Glue, PySpark & Python. You will build and manage workflows with AWS services such as Glue, Lambda, S3, Athena, DynamoDB, and Step Functions. Your role will involve writing and optimizing SQL queries for complex data transformations, implementing Infrastructure as Code (IaC) using Terraform, and ensuring data quality, performance, and reliability at scale. Collaboration with cross-functional teams, troubleshooting, optimization of pipelines, and maintaining clear documentation are key aspects of this role. Key Responsibilities: - Design and maintain scalable E...
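For context on the Glue/PySpark portion of this role, a bare-bones Glue job script might look like the sketch below; the catalog database, table, and S3 output path are placeholders, not details from the listing.

# Skeleton AWS Glue PySpark job: read a catalogued table, filter it with Spark SQL, write Parquet to S3.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (names are placeholders)
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Simple transformation expressed in Spark SQL
source.toDF().createOrReplaceTempView("raw_orders")
cleaned = spark.sql("SELECT order_id, amount FROM raw_orders WHERE amount > 0")

# Write curated output back to S3 as Parquet (placeholder path)
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()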
Posted 4 days ago