4.0 - 8.0 years
12 - 20 Lacs
Kolkata
Work from Office
Responsibilities:
• Design, implement, manage, and optimize data pipelines in Azure Data Factory as per customer business requirements.
• Design and develop Spark SQL/PySpark code in Databricks (see the sketch after this list).
• Integrate different Azure services and external systems to implement data analytics solutions.
• Design and develop code in Azure Logic Apps, Azure Functions, Azure SQL, Synapse, etc.
• Implement best practices in ADF/Databricks/other Azure data engineering services/target databases to maximize job performance, ensure code reusability, and minimize implementation and maintenance cost.
• Ingest structured/semi-structured/unstructured data into ADLS/Blob Storage in batch/near real time/real time from diff...
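The PySpark-in-Databricks and ADLS ingestion items above describe a very common landing-to-curated step. A minimal, hedged sketch of that step; the storage account (examplestorage), containers, and column names are hypothetical:

```python
# Illustrative only: land raw JSON in ADLS, cleanse it in PySpark, write a curated copy.
# Storage account, containers, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls_ingest_example").getOrCreate()

# Read semi-structured JSON landed in ADLS Gen2 / Blob Storage
raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/orders/")

# Basic de-duplication and typing before writing to the curated zone
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write partitioned Parquet to the curated container
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@examplestorage.dfs.core.windows.net/orders/"))
```

In practice this notebook would be wired into an Azure Data Factory pipeline as the Databricks activity; the snippet only shows the transformation step itself.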
Posted 1 month ago
4.0 - 8.0 years
12 - 20 Lacs
Bengaluru
Work from Office
Responsibilities:
• Design, implement, manage, and optimize data pipelines in Azure Data Factory as per customer business requirements.
• Design and develop Spark SQL/PySpark code in Databricks.
• Integrate different Azure services and external systems to implement data analytics solutions.
• Design and develop code in Azure Logic Apps, Azure Functions, Azure SQL, Synapse, etc.
• Implement best practices in ADF/Databricks/other Azure data engineering services/target databases to maximize job performance, ensure code reusability, and minimize implementation and maintenance cost.
• Ingest structured/semi-structured/unstructured data into ADLS/Blob Storage in batch/near real time/real time from diff...
Posted 1 month ago
5.0 - 9.0 years
10 - 17 Lacs
Noida
Work from Office
Azure Data Engineer
Notice Period: Max 30 days
We need hands-on data engineers who can produce beautiful & functional code to solve complex analytics problems.
• Extensive experience with big data platforms including Hadoop, Spark, Hive, Presto, and Kudu.
• Hands-on experience in the design and development of hybrid cloud architecture.
• Extensive experience in at least one of the cloud platforms: Azure (Azure Data Factory, Databricks, Synapse Analytics, Azure Functions, Azure Stream Analytics) or AWS (S3, Lambda, Glue, QuickSight, EMR, EC2, Redshift, Athena, and Presto).
• Hands-on experience in developing large-scale distributed applications in either the Java or Python programming language.
• Hands-on experience ...
Posted 1 month ago
3.0 - 8.0 years
14 - 24 Lacs
Chennai
Hybrid
Role & responsibilities
• Design, develop, and maintain scalable data pipelines using Azure data services such as Azure Data Factory, Azure Databricks, and Apache Spark.
• Implement efficient Extract, Transform, Load (ETL) processes to move and transform data across various sources.
• Knowledge of data warehousing concepts; extensive working experience in Azure Databricks is preferable.
• Utilize Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and other Azure data services to store and retrieve data.
• Performance optimization and troubleshooting capabilities.
Technology: SQL, ADF, ADL
Preferred candidate profile
Perks and benefits
Posted 1 month ago
3.0 - 8.0 years
14 - 24 Lacs
Pune
Hybrid
Role & responsibilities
• Design, develop, and maintain scalable data pipelines using Azure data services such as Azure Data Factory, Azure Databricks, and Apache Spark.
• Implement efficient Extract, Transform, Load (ETL) processes to move and transform data across various sources.
• Knowledge of data warehousing concepts; extensive working experience in Azure Databricks is preferable.
• Utilize Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and other Azure data services to store and retrieve data.
• Performance optimization and troubleshooting capabilities.
Technology: SQL, ADF, ADL
Preferred candidate profile
Perks and benefits
Posted 1 month ago
3.0 - 8.0 years
14 - 24 Lacs
Bengaluru
Hybrid
Role & responsibilities
• Design, develop, and maintain scalable data pipelines using Azure data services such as Azure Data Factory, Azure Databricks, and Apache Spark.
• Implement efficient Extract, Transform, Load (ETL) processes to move and transform data across various sources.
• Knowledge of data warehousing concepts; extensive working experience in Azure Databricks is preferable.
• Utilize Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and other Azure data services to store and retrieve data.
• Performance optimization and troubleshooting capabilities.
Technology: SQL, ADF, ADL
Preferred candidate profile
Perks and benefits
Posted 1 month ago
4.0 - 8.0 years
11 - 20 Lacs
Kolkata
Work from Office
Desired profile:
1. Required years of experience: 3-5 years.
2. Engage in the design, development, and deployment of data integration solutions employing a three-tiered ETL methodology.
3. Utilize Python and PySpark to extract data from diverse sources, perform transformations including filtering and joining, and dispatch processed data to designated destinations (see the sketch after this list).
4. Understanding of and experience with Azure Databricks will be preferred. Knowledge of advanced Python programming concepts is compulsory.
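Item 3 above maps to a short extract, filter/join, dispatch pattern in PySpark. A hedged sketch; paths, the storage account name, and column names are hypothetical:

```python
# Illustrative sketch of "extract, filter/join, dispatch" in PySpark.
# Paths, the storage account, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("integration_example").getOrCreate()

# Extract from two diverse sources: Parquet orders and a CSV customer extract
orders = spark.read.parquet("abfss://raw@examplestorage.dfs.core.windows.net/orders/")
customers = (spark.read.format("csv")
                  .option("header", "true")
                  .load("abfss://raw@examplestorage.dfs.core.windows.net/customers.csv"))

# Transform: filter recent orders, then join on the customer key
recent = orders.filter(F.col("order_date") >= "2024-01-01")
enriched = recent.join(customers, on="customer_id", how="left")

# Dispatch the processed data to the designated destination
enriched.write.mode("append").parquet(
    "abfss://processed@examplestorage.dfs.core.windows.net/enriched_orders/")
```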
Posted 1 month ago
4.0 - 8.0 years
11 - 20 Lacs
Gurugram
Work from Office
Desired profile:
1. Required years of experience: 3-5 years.
2. Engage in the design, development, and deployment of data integration solutions employing a three-tiered ETL methodology.
3. Utilize Python and PySpark to extract data from diverse sources, perform transformations including filtering and joining, and dispatch processed data to designated destinations.
4. Understanding of and experience with Azure Databricks will be preferred. Knowledge of advanced Python programming concepts is compulsory.
Posted 1 month ago
5.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Key Responsibilities:
• Develop and implement data pipelines using Azure Data Factory and Databricks.
• Work with stakeholders to gather requirements and translate them into technical solutions.
• Migrate data from Oracle to Azure Data Lake (see the sketch after this list).
• Optimize data processing workflows for performance and scalability.
• Ensure data quality and integrity throughout the data lifecycle.
• Collaborate with data architects and other team members to design and implement data solutions.
Required Skills:
• Strong experience with Azure Data Services, including Azure Data Factory, Synapse Analytics, and Databricks.
• Proficiency in data transformation and ETL processes.
• Hands-on experience with Oracle to Azure Data Lake migr...
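One hedged way to stage the Oracle-to-Azure Data Lake migration mentioned above is a Spark JDBC read followed by a Parquet write to ADLS. The host, credentials, table name, and the presence of an Oracle JDBC driver on the cluster are all assumptions here:

```python
# Sketch only: copy one Oracle table into ADLS as Parquet via Spark JDBC.
# Host, credentials, table, and driver availability are hypothetical assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_to_adls_example").getOrCreate()

oracle_df = (
    spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//oracle-host.example.com:1521/ORCLPDB1")
        .option("dbtable", "SALES.ORDERS")
        .option("user", "etl_user")
        .option("password", "<from-key-vault>")   # retrieve from a secret scope in practice
        .option("driver", "oracle.jdbc.OracleDriver")
        .option("fetchsize", 10000)               # larger fetch size for bulk reads
        .load()
)

# Land the table in the raw zone of the data lake
oracle_df.write.mode("overwrite").parquet(
    "abfss://raw@examplestorage.dfs.core.windows.net/oracle/sales_orders/")
```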
Posted 1 month ago
7.0 - 10.0 years
16 - 30 Lacs
Kolkata
Work from Office
Azure Data Engineer - Manager (Databricks + PySpark)
Experience: 7 to 10 years, with at least 3 project lifecycles (BE/BTech/MTech)
Roles and responsibilities:
- Generate insight from data through transformation
- Process, cleanse, and verify the integrity of data used for analysis
- Work independently and lead a team for E2E delivery
- Participate in and lead, when needed, project meetings with the customer
- Manage multiple development designs and projects to meet project and customer timelines
- Provide customer training as required
- Participate in internal projects as required
- Hands-on experience with data models and data warehousing technologies
- Candidates must be ...
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Hybrid
• Minimum experience of working on projects as a Senior Azure Data Engineer.
• B.Tech/B.E. degree in Computer Science or Information Technology.
• Experience with enterprise integration and ETL (extract, transform, load) tools like Databricks, Azure Data Factory, and Talend/Informatica, etc.
• Analyzing data using Python, Spark Streaming, SSIS/Informatica batch ETL, and database tools like SQL and MongoDB for processing data from different sources (see the streaming sketch after this list).
• Experience with platform automation tools (DB management, Azure, Jenkins, GitHub) will be an added advantage.
• Design, operate, and integrate different systems to enable efficiencies in key areas of the business.
• Understand business requirements, intera...
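The Spark Streaming requirement above usually translates into a Structured Streaming job. A minimal, hedged file-source sketch (an Event Hubs or Kafka source would follow the same shape); the schema, paths, and storage account are hypothetical:

```python
# Minimal Structured Streaming sketch: stream JSON files from a landing folder
# into Parquet with a checkpoint. All paths and fields are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("streaming_example").getOrCreate()

# Streaming file sources require an explicit schema
schema = (StructType()
          .add("event_id", StringType())
          .add("event_type", StringType())
          .add("amount", DoubleType()))

events = (spark.readStream
               .schema(schema)
               .json("abfss://landing@examplestorage.dfs.core.windows.net/events/"))

query = (events.writeStream
               .format("parquet")
               .option("path", "abfss://curated@examplestorage.dfs.core.windows.net/events/")
               .option("checkpointLocation",
                       "abfss://curated@examplestorage.dfs.core.windows.net/_checkpoints/events/")
               .outputMode("append")
               .start())

query.awaitTermination()
```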
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Noida
Hybrid
• Minimum experience of working on projects as a Senior Azure Data Engineer.
• B.Tech/B.E. degree in Computer Science or Information Technology.
• Experience with enterprise integration and ETL (extract, transform, load) tools like Databricks, Azure Data Factory, and Talend/Informatica, etc.
• Analyzing data using Python, Spark Streaming, SSIS/Informatica batch ETL, and database tools like SQL and MongoDB for processing data from different sources.
• Experience with platform automation tools (DB management, Azure, Jenkins, GitHub) will be an added advantage.
• Design, operate, and integrate different systems to enable efficiencies in key areas of the business.
• Understand business requirements, intera...
Posted 1 month ago
4.0 - 7.0 years
12 - 19 Lacs
Gurugram
Hybrid
• Minimum experience of working on projects as an Azure Data Engineer.
• B.Tech/B.E. degree in Computer Science or Information Technology.
• Experience with enterprise integration and ETL (extract, transform, load) tools like Databricks, Azure Data Factory, and Talend/Informatica, etc.
• Analyzing data using Python, Spark Streaming, SSIS/Informatica batch ETL, and database tools like SQL and MongoDB for processing data from different sources.
• Experience with platform automation tools (DB management, Azure, Jenkins, GitHub) will be an added advantage.
• Design, operate, and integrate different systems to enable efficiencies in key areas of the business.
• Understand business requirements, interacting ...
Posted 1 month ago
7.0 - 12.0 years
17 - 30 Lacs
Bengaluru
Work from Office
Job Title: Azure Data Architect
Experience: More than 7 years
Location: Pan India
Employment Type: Full-Time
Technology: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling
Key Responsibilities:
• Requirement gathering and analysis
• Design of data architecture and data model to ingest data
• Experience with different databases like Synapse, SQL DB, Snowflake, etc.
• Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
• Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
• Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
• Implement data security and governance measures
• Monitor and optimize data pipelines ...
Posted 1 month ago
10.0 - 17.0 years
30 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Title: Azure Data Architect
Location: [Insert Location]
Employment Type: Full-Time
Department: Data & Analytics / Cloud Engineering
Role Overview
We are seeking a seasoned Azure Data Architect with over 12 years of IT experience and a strong background in designing and implementing large-scale Azure data platforms. This role demands deep technical expertise, strategic thinking, and the ability to lead enterprise-grade data initiatives across diverse teams and stakeholders.
Key Responsibilities
• Architect and implement scalable data solutions using Azure services: Data Factory, Data Lake Gen2, Azure SQL, Analysis Services, Databricks, and HDInsight.
• Design and develop enterprise frameworks...
Posted 1 month ago
7.0 - 11.0 years
20 - 27 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Roles and Responsibilities
Key Responsibilities:
• Data Pipeline Development: Lead the design, development, and deployment of data pipelines using Azure OneLake, Azure Data Factory, and Apache Spark, ensuring efficient, scalable, and secure data movement across systems.
• ETL Architecture: Architect and implement ETL (Extract, Transform, Load) workflows, optimizing the process for data ingestion, transformation, and storage in the cloud.
• Data Integration: Build and manage data integration solutions that connect multiple data sources (structured and unstructured) into a cohesive data ecosystem. Use SQL, Python, Scala, and R to manipulate and process large datasets.
• Azure OneLake Expertise...
Posted 1 month ago
4.0 - 5.0 years
3 - 6 Lacs
Agra
Work from Office
Job Title: Databricks Developer (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 4+ Years
Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow o...
Posted 1 month ago
4.0 - 5.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Title: Databricks Developer (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 4+ Years
Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow o...
Posted 1 month ago
4.0 - 5.0 years
3 - 6 Lacs
Kanpur
Work from Office
Job Title: Databricks Developer (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 4+ Years
Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow o...
Posted 1 month ago
7.0 - 8.0 years
8 - 12 Lacs
Ludhiana
Work from Office
Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years
Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.
Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services (see the sketch after this list).
- Productio...
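A hedged illustration of the kind of Databricks-plus-AWS pipeline named above: read raw events from S3, aggregate in PySpark, and persist a partitioned Delta table. The bucket, database/table, and column names are hypothetical, and S3 access from the cluster is assumed:

```python
# Sketch: batch pipeline on Databricks reading from S3 and writing a Delta table.
# Bucket, table, and column names are hypothetical; assumes the cluster has S3 access
# and that the target schema ("analytics") already exists.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3_to_delta_example").getOrCreate()

# Read raw clickstream JSON from S3
clicks = spark.read.json("s3a://example-raw-bucket/clickstream/2024/")

# Aggregate page views per day
daily = (clicks
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "page")
         .agg(F.count("*").alias("views")))

# Delta is native on Databricks; date partitioning keeps single-day queries cheap
(daily.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("event_date")
      .saveAsTable("analytics.daily_page_views"))
```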
Posted 1 month ago
7.0 - 8.0 years
8 - 12 Lacs
Lucknow
Work from Office
Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years
Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.
Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productio...
Posted 1 month ago
7.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years
Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.
Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productio...
Posted 1 month ago
6.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Remote
Role & responsibilities
1. Platform Architecture & Automation
• Design and maintain core Azure infrastructure using IaC tools (Terraform, Bicep, ARM) covering VNets, subnets, Private Link, App Services, ADLS, Data Factory, Databricks, and Cosmos DB.
• Build and manage CI/CD pipelines via Azure DevOps or GitHub Actions to enforce compliance, Git-based version control for notebook/workflow deployment, and automated provisioning of infrastructure.
• Automate operational tasks and governance with scripting (PowerShell, Python, Bash); see the sketch after this list.
2. Git & Source Control
• Maintain Git version control for infrastructure code, Databricks notebooks, and deployment workflows across environments (dev, QA, production).
• Defin...
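As one example of the scripted operational automation mentioned in section 1, here is a small Python sketch that inventories jobs in a Databricks workspace over the Jobs 2.1 REST API. The workspace URL and the environment variable holding the token are assumptions:

```python
# Hedged sketch: list Databricks jobs via the Jobs 2.1 REST API.
# The workspace URL and DATABRICKS_TOKEN environment variable are assumptions.
import os
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace
TOKEN = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{HOST}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 25},
    timeout=30,
)
resp.raise_for_status()

# Print job id and name for a quick operational inventory
for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"]["name"])
```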
Posted 1 month ago
4.0 - 5.0 years
3 - 7 Lacs
Thane
Remote
Contract Duration: 4 Months (Extendable based on Performance)
Job Timings: India Evening Shift (till 11:30 PM IST)
Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks (see the sketch after this list).
- Design, implement, and manage data warehouse ...
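For the Airflow orchestration item above, a hedged sketch of a DAG that submits a Databricks notebook run. It assumes the apache-airflow-providers-databricks package, Airflow 2.4+ (for the schedule argument), and a configured databricks_default connection; the cluster spec and notebook path are hypothetical:

```python
# Hedged sketch of Airflow orchestrating a Databricks notebook run.
# Assumes apache-airflow-providers-databricks is installed and a
# "databricks_default" connection is configured; cluster spec and
# notebook path below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="daily_databricks_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",     # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    run_transform = DatabricksSubmitRunOperator(
        task_id="run_orders_transform",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Repos/data/transform_orders"},
    )
```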
Posted 1 month ago
4.0 - 5.0 years
3 - 7 Lacs
Mumbai
Remote
Contract Duration: 4 Months (Extendable based on Performance)
Job Timings: India Evening Shift (till 11:30 PM IST)
Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse ...
Posted 1 month ago