Jobs
Interviews

15 Azure ADF Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

The metrics insights and analytics team at LTIMindtree is looking for a talented individual to help build dashboards and analytical solutions using AI/ML based on business requirements. Your role will involve providing predictive and prescriptive analytics based on various delivery execution parameters, offering actionable insights to users, and automating processes using new-age machine learning algorithms.

Key Responsibilities:
- Conceptualize, maintain, and automate dashboards as per requirements.
- Automate existing processes to improve productivity and time to market.
- Enable decision-making and action-plan identification through metrics analytics.
- Conduct training sessions and presentations.
- Collaborate with stakeholders to understand business problems and provide solutions.
- Introduce new-age solutions and techniques into the way of working.

Skills Required:
- Minimum 2-5 years of experience with Power BI dashboards/Tableau and Python.
- Minimum 2-5 years of experience in AI/ML development.
- Strong analytical skills with a focus on solutioning and problem-solving, along with an inclination towards numbers.
- Experience working with text analytics and NLP.
- Proficiency in data cleansing, data pre-processing, and exploratory data analysis.
- Knowledge of Azure ADF, Excel macros, and RPA will be advantageous.
- Ability to perform feature engineering, normalize data, and build correlation maps.
- Proficiency in SQL.
- Hands-on experience in model operationalization and pipeline management.
- Ability to work effectively with global teams.
- Good presentation and training skills.

About LTIMindtree: LTIMindtree is a global technology consulting and digital solutions company that helps enterprises across industries reimagine business models, accelerate innovation, and maximize growth through digital technologies. With a team of nearly 90,000 professionals in over 30 countries, LTIMindtree combines domain and technology expertise to drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Visit www.ltimindtree.com for more information.

Posted 2 days ago

Apply

13.0 - 17.0 years

0 Lacs

Maharashtra

On-site

Birlasoft is a powerhouse that brings together domain expertise, enterprise solutions, and digital technologies to redefine business processes. With a consultative and design-thinking approach, we drive societal progress by enabling our customers to run businesses with efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, we have a team of 12,500+ professionals dedicated to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Social Responsibility (CSR) activities, demonstrating our commitment to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

As an Azure Tech PM at Birlasoft, you will lead and deliver complex data analytics projects. With 13-15 years of experience, you will play a critical role in overseeing the planning, execution, and successful delivery of data analytics initiatives while managing a team of 15+ skilled resources. You should have exceptional communication skills, a deep understanding of Agile methodologies, and a strong background in managing cross-functional teams on data analytics projects.

Key Responsibilities:
- Lead end-to-end planning, coordination, and execution of data analytics projects, ensuring adherence to project scope, timelines, and quality standards.
- Guide the team in defining project requirements, objectives, and success criteria, drawing on extensive data analytics experience.
- Apply Agile methodologies to create and maintain detailed project plans, sprint schedules, and resource allocations for efficient project delivery.
- Manage a team of 15+ technical resources, fostering collaboration and a culture of continuous improvement.
- Collaborate closely with cross-functional stakeholders to align project goals with business objectives.
- Monitor project progress; identify risks, issues, and bottlenecks; and implement mitigation strategies.
- Provide regular project updates to executive leadership, stakeholders, and project teams.
- Facilitate daily stand-ups, sprint planning, backlog grooming, and retrospective meetings to promote transparency and efficiency.
- Drive the implementation of best practices for data analytics, ensuring data quality, accuracy, and compliance with industry standards.
- Act as a point of escalation for project-related challenges and work with the team to resolve issues promptly.
- Collaborate with cross-functional teams to ensure successful project delivery, including testing, deployment, and documentation.
- Provide input to project estimation, resource planning, and risk management activities.

Mandatory Experience:
- Minimum 5+ years of technical project management experience in data lake and data warehousing (DW) projects.
- Strong understanding of DW process execution, from data acquisition to visualization.
- Minimum 3+ years of exposure to Azure skills such as Azure ADF, Azure Databricks, Synapse, SQL, and Power BI, or experience managing at least 2 end-to-end Azure cloud projects.

Other Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13-15 years of progressive experience in technical project management focused on data analytics and data-driven initiatives.
- In-depth knowledge of data analytics concepts, tools, and technologies.
- Exceptional leadership, team management, interpersonal, and communication skills.
- Demonstrated success in delivering data analytics projects on time, within scope, and meeting quality expectations.
- Strong problem-solving skills and a proactive attitude towards identifying challenges.
- Project management certifications such as PMP, PMI-ACP, or CSM would be an added advantage.
- Ability to thrive in a dynamic, fast-paced environment, managing multiple projects simultaneously.

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

The Business Analyst position at Piramal Critical Care (PCC) within the IT department in Kurla, Mumbai involves acting as a liaison between PCC system users, software support vendors, and internal IT support teams. The ideal candidate is expected to be a technical contributor and advisor to PCC business users, assisting in defining strategic application development and integration to support business processes effectively. Key stakeholders for this role include internal teams such as Supply Chain, Finance, Infrastructure, PPL Corporate, and Quality, as well as external stakeholders such as the MS Support team, 3PLs, and consultants. The Business Analyst will report to the Chief Manager - IT Business Partner.

The ideal candidate should hold a B.S. in Information Technology, Computer Science, or equivalent, with 8-10 years of experience in data warehousing, BI, analytics, and ETL tools. Experience in the pharmaceutical or medical device industry is required, along with familiarity with large global reporting tools such as Qlik/Power BI, SQL, the Microsoft Power Platform, and related platforms. Knowledge of the computer system validation lifecycle, project management tools, and office tools is also essential.

Key responsibilities include defining user and technical requirements, leading implementation of data warehousing, analytics, and ETL systems, managing vendor project teams, maintaining partnerships with business teams, and proposing IT budgets. The candidate will collaborate with IT and business teams, manage ongoing business applications, ensure system security, and present project updates to the IT steering committee. The successful candidate must possess excellent interpersonal and communication skills, self-motivation, a proactive customer service attitude, leadership abilities, and a strong service focus. They should be capable of effectively communicating business needs to technology teams, managing stakeholder expectations, and working collaboratively to achieve results.

Piramal Critical Care (PCC) is a subsidiary of Piramal Pharma Limited (PPL) and a global player in hospital generics, particularly inhaled anaesthetics. PCC is committed to delivering critical care solutions globally and maintaining sustainable growth for stakeholders. With a wide presence across the USA, Europe, and over 100 countries, PCC's product portfolio includes inhalation anaesthetics and intrathecal baclofen therapy. PCC's workforce comprises over 400 employees across 16 countries and is dedicated to expanding its global footprint through new product additions in critical care. Committed to corporate social responsibility, PCC collaborates with partner organizations to provide hope and resources to those in need while caring for the environment.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

NTT DATA is looking for an exceptional, innovative, and passionate individual to join their team as a Cloud Solution Delivery Advisor in Bengaluru, Karnataka, India. If you are seeking to be part of an inclusive, adaptable, and forward-thinking organization, this opportunity is for you.

As a Cloud Solution Delivery Advisor at NTT DATA, you will apply your expertise in the Microsoft BI technology stack, particularly SSIS and Azure ADF. Your role will involve supporting and troubleshooting SSIS and Azure ADF data load jobs, and requires good communication skills. You will also be expected to have an in-depth understanding of enterprise-level data warehouse systems and expertise in performance-tuning SQL programs using SSIS and Azure ADF. Experience with Agile models and CI/CD processes will be an advantage, and your contributions to process improvement ideas will be highly valued.

About NTT DATA: NTT DATA is a trusted global innovator of business and technology services with a revenue of $30 billion. They serve 75% of the Fortune Global 100 companies and are dedicated to helping clients innovate, optimize, and transform for long-term success. Recognized as a Global Top Employer, NTT DATA has diverse experts across more than 50 countries and a robust partner ecosystem. Their services range from business and technology consulting to data and artificial intelligence solutions, industry-specific offerings, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a prominent provider of digital and AI infrastructure globally and is part of the NTT Group, which invests significantly in research and development to support organizations and society in navigating the digital future confidently and sustainably.

If you are looking to be part of a dynamic team at NTT DATA and contribute to cutting-edge technology solutions, apply now to seize this exciting opportunity.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Nagpur, Maharashtra

On-site

About the Company: HCL Technologies is a global technology company that helps enterprises reimagine their businesses for the digital age. The company's mission is to deliver innovative solutions that drive business transformation and enhance customer experiences.

About the Role: We are looking for a skilled professional to join our team, with technical expertise in the following technologies and platforms:
- Python
- Azure ADF
- API development
- Azure Databricks
- CI/CD
- DevOps
- Terraform
- AWS
- Big Data

To apply for this position, please share your resume at sanchita.mitra@hcltech.com.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

NTT DATA is looking for a UPS - GD Offshore - Senior Developer - SCS DIGITAL PLATFORM to join their team in Bangalore, Karnataka, India. As a Senior Developer, you will be responsible for tasks related to Azure ADF, SSIS, data modeling, and MS SQL programming. The ideal candidate has at least 5 years of hands-on experience in these areas and is proficient in SQL programming, SQL tuning, Azure SQL databases, SQL Server Integration Services (SSIS), and stored procedure development. Strong skills in data modeling and data warehouse implementations are also required.

NTT DATA is a $30 billion global innovator in business and technology services, serving 75% of the Fortune Global 100 companies. As a Global Top Employer, NTT DATA has a diverse team of experts in more than 50 countries and collaborates with established and start-up companies alike. Their services include business and technology consulting, data and artificial intelligence solutions, industry-specific offerings, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure worldwide and is part of the NTT Group, which invests significantly in research and development to support organizations and society in transitioning confidently into the digital future.

If you are a passionate and innovative individual with the required skills and experience, and want to be part of a forward-thinking and inclusive organization like NTT DATA, apply now to join their team and grow with them towards long-term success. Visit us at us.nttdata.com for more information.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you are part of a team of bright individuals working with cutting-edge technologies. Our purpose is to bring about real positive change in an increasingly virtual world, transcending generational gaps and the disruptions of the future.

We are currently seeking experienced Power BI professionals with 6-9 years of experience in the following areas:
- Power BI development
- Azure ADF

Key Responsibilities:
- Develop new Power BI reports and fix data issues in existing reports
- Support users with data validation
- Assist the data team in understanding functional requirements
- Write SQL and complex DAX queries proficiently
- Capture new report specifications based on existing requirements
- Coordinate with various groups to understand report KPIs
- Participate in data requirement sessions and develop Power BI reports
- Provide solutions and design prototypes for use-case reports
- Specialize in different reporting tools
- Assess report features and build a report matrix

Certifications: Mandatory certifications required.

At YASH, you have the opportunity to shape a career that aligns with your aspirations while working in a collaborative and inclusive team environment. We promote career-oriented skilling models and harness technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is guided by four principles:
- Flexible work arrangements and a free-spirited, emotionally positive environment
- Agile self-determination, trust, transparency, and open collaboration
- All necessary support for achieving business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Work from Office

We are looking for 7+ years of relevant hands-on data engineering experience - data ingestion, processing, and exploratory analysis - to build solutions that deliver value through data as an asset. Data engineers build, test, and deploy data pipelines that move data across systems efficiently and reliably, and should stay on top of the latest architectural trends on the Azure cloud. We want people who understand parallel and distributed processing, storage, concurrency, and fault-tolerant systems, who thrive on new technologies, and who adapt and learn easily to meet the needs of next-generation engineering challenges.

Technical Skills (Must-Have):
- Applied experience with distributed data processing frameworks - Spark and Databricks with Python and SQL
- At least 2 end-to-end data analytics projects covering Databricks configuration, Unity Catalog, Delta Sharing, and the medallion architecture
- Applied experience with Azure data services - ADLS, Delta Lake, Delta Live Tables, Azure Storage, RBAC
- Applied experience with unit testing and system integration testing using a Python framework
- Applied experience with DevOps, designing and deploying CI/CD pipelines using Jenkins
- Azure Data Engineering (DP-203) or Databricks certification
- Prior working experience with a high-performance agile team - Scrum, JIRA, JFrog, and Confluence

Nice to have:
- IoT data-driven product/platform development background
- Medical and healthcare domain expertise
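The medallion architecture named in the must-have skills layers data as raw (bronze), cleansed (silver), and aggregated (gold). A minimal, framework-free Python sketch of that flow is below; in a real Databricks pipeline these layers would be Delta tables, and the record fields and cleansing rules here are illustrative assumptions, not from the posting:

```python
# Medallion-style layering sketch: bronze (raw) -> silver (cleansed) -> gold (aggregated).
# Plain dicts stand in for Delta tables purely to show the shape of the flow.

def to_silver(bronze_rows):
    """Cleanse raw rows: drop records missing the key, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("device_id") is None:
            continue  # reject incomplete records at the silver layer
        silver.append({
            "device_id": str(row["device_id"]),
            "temp_c": float(row["temp_c"]),  # raw values arrive as strings
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed rows into a business-level metric per device."""
    by_device = {}
    for row in silver_rows:
        by_device.setdefault(row["device_id"], []).append(row["temp_c"])
    return {dev: sum(vals) / len(vals) for dev, vals in by_device.items()}

bronze = [
    {"device_id": 1, "temp_c": "20.0"},
    {"device_id": 1, "temp_c": "22.0"},
    {"device_id": None, "temp_c": "99.9"},  # malformed record, dropped in silver
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'1': 21.0}
```

The design point the layering buys you: raw data is kept untouched in bronze, so cleansing rules in silver and metrics in gold can be changed and replayed without re-ingesting.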

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Department: Development
Location: Pune, India

Description: Our bright team fast-tracks their careers with international exposure and ways of working based on agile development best practices from globally renowned technology consultancies.

Key Responsibilities: Data Architect
- Create data models that specify how data is formatted, stored, and retrieved inside an organisation, including conceptual, logical, and physical data models.
- Design and optimise databases, including selecting appropriate database management systems (DBMS) and standardising and indexing data.
- Create and maintain data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines to seamlessly transport data between systems.
- Collaborate with business analysts, data scientists, and other stakeholders to understand data requirements and align architecture with business objectives.
- Stay current with industry trends, best practices, and advancements in data management through continuous learning and professional development.
- Establish processes for monitoring and improving the quality of data within the organisation; implement data quality tools and practices to detect and resolve data issues.

Requirements and Skills: Data Architect
- Prior experience in data warehouse design, data modelling, database design, and data administration.
- Database expertise: knowledge of data warehousing concepts and proficiency in various database systems (e.g., SQL).
- Knowledge of data modelling tools such as Visual Paradigm.
- Knowledge of ETL methods and technologies (for example, Azure ADF, Events).
- Expertise in writing complex stored procedures.
- Good understanding of data modelling concepts such as star schema and snowflake schema.
- Strong problem-solving and analytical skills to build effective data solutions.
- Excellent communication skills to work with cross-functional teams and convert business objectives into technical solutions.
- Knowledge of data governance: understanding of data governance principles, data security, and regulatory compliance.
- Knowledge of programming languages such as .NET can be advantageous.
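The star schema mentioned in the requirements centres a fact table on foreign keys into dimension tables. A small sketch using Python's built-in sqlite3 is below; the table and column names (`fact_sales`, `dim_product`, `dim_date`) are hypothetical examples chosen for illustration:

```python
import sqlite3

# Star-schema sketch: one fact table keyed to two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales  VALUES (1, 20240101, 10.0), (1, 20240101, 5.0), (2, 20240101, 7.5);
""")

# The characteristic star-schema query: join the fact to a dimension, then aggregate.
rows = con.execute("""
    SELECT p.product_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
print(rows)  # [('Gadget', 7.5), ('Widget', 15.0)]
```

A snowflake schema differs only in that the dimension tables themselves are further normalised into sub-dimensions.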

Posted 3 weeks ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Chennai, Tamil Nadu, India

On-site

Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake in Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (an equivalent tool is fine)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary

Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake in Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (an equivalent tool is fine)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Mumbai

Work from Office

Roles & Responsibilities:
- 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) - mandatory
- Strong in Azure SQL; knowledge of Synapse / Analytics is good to have
- Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies
- Good communication skills - written and verbal; able to work directly with customers
- Ready to work in 2nd shift; flexible
- Defines, designs, develops, and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark
- Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark
- Strong T-SQL skills with experience in Azure SQL DW
- Experience handling structured and unstructured datasets
- Experience in data modeling and advanced SQL techniques
- Experience implementing Azure Data Factory pipelines using the latest technologies and techniques
- Good exposure to application development
- The candidate should work independently with minimal supervision

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

JOB DESCRIPTION

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256).
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
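The "hashing email addresses with SHA-256" requirement refers to the pseudonymisation step server-side tracking APIs expect before user identifiers are sent. A minimal Python sketch is below; the trim-and-lowercase normalisation shown is the common convention for email fields, but the exact rules should be confirmed against the target platform's documentation:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize then SHA-256 hash an email for server-side event APIs.

    Normalisation here (strip whitespace, lowercase) follows the common
    CAPI convention for email fields; verify against the platform's docs.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Two formattings of the same address yield the same stable 64-hex-char hash,
# which is what lets the platform match the event to a user without seeing
# the raw email.
a = hash_email("  User@Example.com ")
b = hash_email("user@example.com")
print(a == b)    # True
print(len(b))    # 64
```

Without the normalisation step, trivially different formattings of one address would hash to unrelated values and break attribution matching.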

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Consultant - Software Engineer (with C#)

JOB DESCRIPTION: We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines.
- Implement cryptographic hashing (e.g., SHA-256).
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with C# and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Delhi / NCR

Hybrid

Develop ETL pipelines using SQL, C#, and Python; performance tuning; design scalable DB architecture; maintain technical documentation.

Required Candidate Profile:
- 5+ years in SQL/T-SQL, performance tuning, and developing ETL processes
- Hands-on C#; WPF will be a plus
- Experience in AWS and Azure is a must

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies