
17 Azure ADF Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Mumbai

Work from Office


Roles & Responsibilities:
- 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) - mandatory
- Strong in Azure SQL; knowledge of Synapse / Analytics is a plus
- Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies
- Good written and verbal communication skills; can work directly with the customer
- Ready to work in the 2nd shift; flexible
- Defines, designs, develops, and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark
- Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark
- Strong T-SQL skills with experience in Azure SQL DW
- Experience handling structured and unstructured datasets
- Experience in data modeling and advanced SQL techniques
- Experience implementing Azure Data Factory pipelines using the latest technologies and techniques
- Good exposure to application development
- Should work independently with minimal supervision

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


JOB DESCRIPTION

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256).
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
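A core technique this role names is hashing user identifiers (e.g., email addresses with SHA-256) before sending them to server-side tracking APIs such as Meta CAPI. As a rough illustration of the general approach, a minimal Python sketch; the trim-and-lowercase normalization shown is the commonly documented preprocessing step, but exact rules vary by platform, so treat it as an assumption to verify:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address and return its SHA-256 hex digest.

    Normalization here (strip whitespace, lowercase) is an assumption
    based on common server-side tracking guidance; confirm against the
    target platform's specification.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Differently formatted inputs hash to the same value after normalization
print(hash_email("  User@Example.com ") == hash_email("user@example.com"))
```

The point of hashing before transmission is that the platform can match users without ever receiving the raw identifier.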

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Consultant - Software Engineer (with C#)

JOB DESCRIPTION:

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines.
- Implement cryptographic hashing (e.g., SHA-256).
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with C# and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Delhi / NCR

Hybrid


Develop ETL pipelines using SQL, C#, and Python; performance tuning; design scalable DB architecture; maintain technical documentation.

Required Candidate Profile:
- 5+ years in SQL/T-SQL, performance tuning, and developing ETL processes
- Hands-on C# and WPF experience will be a plus
- Experience in AWS and Azure is a must
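The extract-transform-load work this listing describes can be sketched in miniature. This is an illustrative toy example using Python's stdlib sqlite3 module, not the production stack the listing names (which would be SQL Server/T-SQL with C#); the table names and 18% tax derivation are made up for the demonstration:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Toy ETL: extract raw orders, transform (filter + derive a column), load a fact table."""
    cur = conn.cursor()
    # Extract: a raw staging table, as if landed from a source system
    cur.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 120.0, "paid"), (2, 35.5, "cancelled"), (3, 80.0, "paid")],
    )
    # Transform + Load: keep only paid orders, derive a tax column in SQL
    cur.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL, tax REAL)")
    cur.execute(
        """
        INSERT INTO fact_orders
        SELECT id, amount, ROUND(amount * 0.18, 2)
        FROM raw_orders
        WHERE status = 'paid'
        """
    )
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
print(run_etl(conn))  # 2 paid orders loaded
```

Pushing the transform into SQL, as above, is the usual pattern the listing implies: the host language (Python or C#) orchestrates, while the database engine does the set-based work.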

Posted 1 week ago

Apply

4 - 8 years

15 - 30 Lacs

Chandigarh

Work from Office


Responsibilities:
- Design new functionalities and solutions to problems
- Implement back-end services and user-facing features in Java
- Participate in development scope planning, issue prioritization, and code reviews
- Contribute to all aspects of SaaS product development, from requirements analysis to product release
- Collaborate with cross-functional teams to develop implementation plans with a focus on innovation, quality, and sustainability
- Learn and adopt cutting-edge technologies and tools for building enterprise SaaS solutions
- Ensure the delivery of high-quality enterprise offerings on schedule

Requirements:
- Minimum of 8-12 years of experience in the software industry
- Hands-on experience in Java and microservices (Spring Boot)
- Experience with microservices and REST APIs
- Familiarity with AWS services such as ECS and S3
- Knowledge of Java 8, JUnit, and Spring Boot
- Experience in end-to-end development projects from scratch
- Excellent problem-solving and debugging skills
- Strong communication skills for global team interaction
- In-depth knowledge of enterprise design patterns
- Ability to learn and apply new technologies in development
- Strong problem-solving mindset
- Confidence and resourcefulness in the face of challenges
- Kerala candidates ONLY

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Chennai

Work from Office


About the role:
We are looking for a Machine Learning Engineer who will work on a broad range of cutting-edge data analytics and machine learning problems across a variety of industries. More specifically, you will:
- Collaborate with data scientists to transition machine learning models from development to production.
- Design and implement MLOps pipelines for model training, deployment, and monitoring on GCP.
- Utilize GCP services for version control, continuous integration, and continuous deployment (CI/CD) of machine learning models.
- Implement monitoring and logging solutions to track the performance and health of deployed models.
- Optimize and scale machine learning workflows on GCP to handle production workloads efficiently.
- Stay updated on the latest developments in MLOps practices and GCP services to ensure best practices are followed.

Desired Skills and Experience:
- 3+ years of experience, with at least 3+ years of relevant data science experience.
- Good working knowledge of GCP.
- Proficient in writing structured Python.
- Follows good software engineering practices and has an interest in building reliable and robust software.
- Good knowledge of data science concepts and professional experience in developing and enhancing algorithms and models to solve business problems.
- Experience conducting quantitative analyses and interpreting results.
- Working knowledge of Linux or Unix environments, ideally in a cloud environment.
- Working knowledge of Spark/PySpark is desirable.
- Excellent written and verbal communication skills.
- B.Tech from a Tier-1 college; M.S. or M.Tech is preferred.
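The model-monitoring responsibility above often reduces to computing a live metric over a sliding window of predictions and alerting when it degrades past a threshold. A minimal, platform-agnostic sketch in Python; the window size and threshold are illustrative assumptions, not GCP specifics:

```python
from collections import deque

class AccuracyMonitor:
    """Track rolling accuracy of a deployed model and flag degradation."""

    def __init__(self, window: int = 100, threshold: float = 0.9):
        self.outcomes = deque(maxlen=window)  # 1 = correct, 0 = incorrect
        self.threshold = threshold

    def record(self, prediction, actual) -> None:
        self.outcomes.append(1 if prediction == actual else 0)

    def accuracy(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def degraded(self) -> bool:
        # Alert only once the window has filled, so early samples don't trigger noise
        return len(self.outcomes) == self.outcomes.maxlen and self.accuracy() < self.threshold

monitor = AccuracyMonitor(window=5, threshold=0.8)
for pred, actual in [(1, 1), (0, 0), (1, 0), (1, 0), (0, 0)]:
    monitor.record(pred, actual)
print(monitor.accuracy(), monitor.degraded())  # 0.6 True
```

In a real deployment the same check would typically be wired into the platform's alerting service rather than polled in process.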

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Bareilly

Work from Office


Responsibilities:
- Design new functionalities and solutions to problems
- Implement back-end services and user-facing features in Java
- Participate in development scope planning, issue prioritization, and code reviews
- Contribute to all aspects of SaaS product development, from requirements analysis to product release
- Collaborate with cross-functional teams to develop implementation plans with a focus on innovation, quality, and sustainability
- Learn and adopt cutting-edge technologies and tools for building enterprise SaaS solutions
- Ensure the delivery of high-quality enterprise offerings on schedule

Requirements:
- Minimum of 8-12 years of experience in the software industry
- Hands-on experience in Java and microservices (Spring Boot)
- Experience with microservices and REST APIs
- Familiarity with AWS services such as ECS and S3
- Knowledge of Java 8, JUnit, and Spring Boot
- Experience in end-to-end development projects from scratch
- Excellent problem-solving and debugging skills
- Strong communication skills for global team interaction
- In-depth knowledge of enterprise design patterns
- Ability to learn and apply new technologies in development
- Strong problem-solving mindset
- Confidence and resourcefulness in the face of challenges
- Kerala candidates ONLY

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Hyderabad

Work from Office


About the role:
We are looking for a Machine Learning Engineer who will work on a broad range of cutting-edge data analytics and machine learning problems across a variety of industries. More specifically, you will:
- Collaborate with data scientists to transition machine learning models from development to production.
- Design and implement MLOps pipelines for model training, deployment, and monitoring on GCP.
- Utilize GCP services for version control, continuous integration, and continuous deployment (CI/CD) of machine learning models.
- Implement monitoring and logging solutions to track the performance and health of deployed models.
- Optimize and scale machine learning workflows on GCP to handle production workloads efficiently.
- Stay updated on the latest developments in MLOps practices and GCP services to ensure best practices are followed.

Desired Skills and Experience:
- 3+ years of experience, with at least 3+ years of relevant data science experience.
- Good working knowledge of GCP.
- Proficient in writing structured Python.
- Follows good software engineering practices and has an interest in building reliable and robust software.
- Good knowledge of data science concepts and professional experience in developing and enhancing algorithms and models to solve business problems.
- Experience conducting quantitative analyses and interpreting results.
- Working knowledge of Linux or Unix environments, ideally in a cloud environment.
- Working knowledge of Spark/PySpark is desirable.
- Excellent written and verbal communication skills.
- B.Tech from a Tier-1 college; M.S. or M.Tech is preferred.

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Chennai

Work from Office


About the role:
We are looking for a Machine Learning Engineer who will work on a broad range of cutting-edge data analytics and machine learning problems across a variety of industries. More specifically, you will:
- Collaborate with data scientists to transition machine learning models from development to production.
- Design and implement MLOps pipelines for model training, deployment, and monitoring on GCP.
- Utilize GCP services for version control, continuous integration, and continuous deployment (CI/CD) of machine learning models.
- Implement monitoring and logging solutions to track the performance and health of deployed models.
- Optimize and scale machine learning workflows on GCP to handle production workloads efficiently.
- Stay updated on the latest developments in MLOps practices and GCP services to ensure best practices are followed.

Desired Skills and Experience:
- 3+ years of experience, with at least 3+ years of relevant data science experience.
- Good working knowledge of GCP.
- Proficient in writing structured Python.
- Follows good software engineering practices and has an interest in building reliable and robust software.
- Good knowledge of data science concepts and professional experience in developing and enhancing algorithms and models to solve business problems.
- Experience conducting quantitative analyses and interpreting results.
- Working knowledge of Linux or Unix environments, ideally in a cloud environment.
- Working knowledge of Spark/PySpark is desirable.
- Excellent written and verbal communication skills.
- B.Tech from a Tier-1 college; M.S. or M.Tech is preferred.

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Bengaluru

Work from Office


About the role:
We are looking for a Machine Learning Engineer who will work on a broad range of cutting-edge data analytics and machine learning problems across a variety of industries. More specifically, you will:
- Collaborate with data scientists to transition machine learning models from development to production.
- Design and implement MLOps pipelines for model training, deployment, and monitoring on GCP.
- Utilize GCP services for version control, continuous integration, and continuous deployment (CI/CD) of machine learning models.
- Implement monitoring and logging solutions to track the performance and health of deployed models.
- Optimize and scale machine learning workflows on GCP to handle production workloads efficiently.
- Stay updated on the latest developments in MLOps practices and GCP services to ensure best practices are followed.

Desired Skills and Experience:
- 3+ years of experience, with at least 3+ years of relevant data science experience.
- Good working knowledge of GCP.
- Proficient in writing structured Python.
- Follows good software engineering practices and has an interest in building reliable and robust software.
- Good knowledge of data science concepts and professional experience in developing and enhancing algorithms and models to solve business problems.
- Experience conducting quantitative analyses and interpreting results.
- Working knowledge of Linux or Unix environments, ideally in a cloud environment.
- Working knowledge of Spark/PySpark is desirable.
- Excellent written and verbal communication skills.
- B.Tech from a Tier-1 college; M.S. or M.Tech is preferred.

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Hyderabad

Work from Office


Responsibilities:
- Design new functionalities and solutions to problems
- Implement back-end services and user-facing features in Java
- Participate in development scope planning, issue prioritization, and code reviews
- Contribute to all aspects of SaaS product development, from requirements analysis to product release
- Collaborate with cross-functional teams to develop implementation plans with a focus on innovation, quality, and sustainability
- Learn and adopt cutting-edge technologies and tools for building enterprise SaaS solutions
- Ensure the delivery of high-quality enterprise offerings on schedule

Requirements:
- Minimum of 8-12 years of experience in the software industry
- Hands-on experience in Java and microservices (Spring Boot)
- Experience with microservices and REST APIs
- Familiarity with AWS services such as ECS and S3
- Knowledge of Java 8, JUnit, and Spring Boot
- Experience in end-to-end development projects from scratch
- Excellent problem-solving and debugging skills
- Strong communication skills for global team interaction
- In-depth knowledge of enterprise design patterns
- Ability to learn and apply new technologies in development
- Strong problem-solving mindset
- Confidence and resourcefulness in the face of challenges
- Kerala candidates ONLY

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Chennai

Work from Office


Responsibilities:
- Design new functionalities and solutions to problems
- Implement back-end services and user-facing features in Java
- Participate in development scope planning, issue prioritization, and code reviews
- Contribute to all aspects of SaaS product development, from requirements analysis to product release
- Collaborate with cross-functional teams to develop implementation plans with a focus on innovation, quality, and sustainability
- Learn and adopt cutting-edge technologies and tools for building enterprise SaaS solutions
- Ensure the delivery of high-quality enterprise offerings on schedule

Requirements:
- Minimum of 8-12 years of experience in the software industry
- Hands-on experience in Java and microservices (Spring Boot)
- Experience with microservices and REST APIs
- Familiarity with AWS services such as ECS and S3
- Knowledge of Java 8, JUnit, and Spring Boot
- Experience in end-to-end development projects from scratch
- Excellent problem-solving and debugging skills
- Strong communication skills for global team interaction
- In-depth knowledge of enterprise design patterns
- Ability to learn and apply new technologies in development
- Strong problem-solving mindset
- Confidence and resourcefulness in the face of challenges
- Kerala candidates ONLY

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Bengaluru

Work from Office


Responsibilities:
- Design new functionalities and solutions to problems
- Implement back-end services and user-facing features in Java
- Participate in development scope planning, issue prioritization, and code reviews
- Contribute to all aspects of SaaS product development, from requirements analysis to product release
- Collaborate with cross-functional teams to develop implementation plans with a focus on innovation, quality, and sustainability
- Learn and adopt cutting-edge technologies and tools for building enterprise SaaS solutions
- Ensure the delivery of high-quality enterprise offerings on schedule

Requirements:
- Minimum of 8-12 years of experience in the software industry
- Hands-on experience in Java and microservices (Spring Boot)
- Experience with microservices and REST APIs
- Familiarity with AWS services such as ECS and S3
- Knowledge of Java 8, JUnit, and Spring Boot
- Experience in end-to-end development projects from scratch
- Excellent problem-solving and debugging skills
- Strong communication skills for global team interaction
- In-depth knowledge of enterprise design patterns
- Ability to learn and apply new technologies in development
- Strong problem-solving mindset
- Confidence and resourcefulness in the face of challenges
- Kerala candidates ONLY

Posted 2 months ago

Apply

8 - 10 years

0 - 3 Lacs

Mumbai

Hybrid


Desired Candidate Profile:
- Strong understanding of cloud-based technologies, including Azure Databricks, Delta Lake, Azure Data Factory (ADF), and Azure Synapse Analytics.
- 8-10 years of experience in software development with expertise in the Python programming language.
- Experience working on big data processing projects involving large datasets.

Posted 3 months ago

Apply

8 - 10 years

0 - 3 Lacs

Bangalore Rural

Hybrid


Desired Candidate Profile:
- Strong understanding of cloud-based technologies, including Azure Databricks, Delta Lake, Azure Data Factory (ADF), and Azure Synapse Analytics.
- 8-10 years of experience in software development with expertise in the Python programming language.
- Experience working on big data processing projects involving large datasets.

Posted 3 months ago

Apply

8 - 10 years

0 - 3 Lacs

Bengaluru

Hybrid


Desired Candidate Profile:
- Strong understanding of cloud-based technologies, including Azure Databricks, Delta Lake, Azure Data Factory (ADF), and Azure Synapse Analytics.
- 8-10 years of experience in software development with expertise in the Python programming language.
- Experience working on big data processing projects involving large datasets.

Posted 3 months ago

Apply

12 - 14 years

30 - 40 Lacs

Pune

Work from Office


10+ years of experience in data architecture, with a strong foundation in at least one on-premises ETL tool and cloud-based ETL tools.
- Must have played the data architect role for at least 4+ years
- Hands-on experience with Informatica, DataStage, SSIS, Talend
- Hands-on experience with Azure ADF or Databricks, PySpark, and Python
- Minimum of 4 years of hands-on expertise in Spark, including Spark job performance optimization techniques
- Minimum of 6 years of hands-on involvement with Azure cloud
- Hands-on experience with Azure Batch, Azure Functions, storage accounts, Key Vault, Snowflake/Synapse, SQL MI, and Azure Monitor
- Practical hands-on experience with the following areas of DW/BI: BI, data quality, master & metadata management, data governance, big data, performance tuning, and infrastructure aspects
- Proficiency in crafting high-level designs for data warehousing solutions on Azure cloud
- Proven track record of implementing big-data solutions within the Azure ecosystem, including data lakes
- Familiarity with data warehousing, data quality assurance, and monitoring practices
- Demonstrated capability in constructing scalable data pipelines and ETL processes
- Proficiency in testing methodologies and validating data pipelines
- Experience with or working knowledge of DevOps environments
- Practical experience in data security services
- Understanding of data modeling, integration, and design principles
- Strong communication and analytical skills
- A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
