
16 ADF Framework Jobs

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
You must be an experienced Azure Cloud engineer with 8-10 years of experience, working as part of the AI engineering team to develop, implement, optimize, and maintain cloud-based solutions. You will be responsible for deploying and debugging cloud stacks, adopting and implementing cloud best practices, and ensuring the security of the cloud infrastructure.

Responsibilities:
- As a project scrum team member, design and implement the most optimal cloud-based solutions for the company.
- Ensure application performance, uptime, and scale, maintaining high standards for code quality and thoughtful design.
- Modify and improve existing systems.
- Educate teams on the implementation of new cloud technologies and initiatives.
- Ensure efficient functioning of data storage and processing functions in accordance with company security policies and best practices in cloud security.
- Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Regularly review existing systems and make recommendations for improvements.

Qualifications:
- Degree in computer science or a similar field.
- Eight or more years of experience in architecting, designing, developing, and implementing cloud solutions in Azure or AWS.
- Understanding of core cloud concepts such as infrastructure as code, IaaS, PaaS, and SaaS.
- Strong proficiency in Python and experience with REST API development.
- Design and implement scalable, secure, and efficient cloud-based solutions using Azure services.
- Develop and maintain RESTful APIs to support various applications.
- Technologies: Python, Terraform, Azure App Services, Functions, App Insights, ADF in Azure; a similar technology stack for AWS (ECS, Lambda, S3, Glue jobs, etc.).
- Develop and maintain continuous integration and continuous deployment pipelines (Jenkins, Groovy scripts).
- Develop containerized solutions and orchestration (Docker, Kubernetes, App Service, or ECS).
- Experience with serverless architecture, cloud computing, cloud-native applications, and scalability.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Optimize applications for maximum speed and scalability.
- Implement robust security measures and ensure compliance with industry standards.
- Monitor and troubleshoot application performance and resolve any issues.
- Participate in code reviews and contribute to the continuous improvement of the development process.
- Development experience with configuration management tools (Terraform, Ansible, ARM templates).
- Relevant Azure/AWS certification preferred.
- Troubleshooting and analytical skills.
- Knowledge of AI & ML technologies, as well as ML model management, is a plus.
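Since the posting stresses Python REST API development, a minimal sketch of a REST-style endpoint using only the standard library's WSGI interface may help orient candidates (the route and payload are illustrative, not from the posting; any WSGI server, including Azure App Service, can host the same callable):

```python
import json

def app(environ, start_response):
    """Minimal WSGI application exposing one REST endpoint: GET /health."""
    if environ.get("REQUEST_METHOD") == "GET" and environ.get("PATH_INFO") == "/health":
        body = json.dumps({"status": "ok"}).encode("utf-8")
        status = "200 OK"
    else:
        body = json.dumps({"error": "not found"}).encode("utf-8")
        status = "404 Not Found"
    # WSGI contract: report status/headers, then return an iterable of bytes.
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]
```

In production this callable would sit behind gunicorn or a Functions HTTP trigger rather than the stdlib reference server.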

Posted 5 days ago

Apply

5.0 - 10.0 years

10 - 16 Lacs

Hyderabad

Remote

Job description:
As an ETL Developer for the Data and Analytics team at Guidewire, you will participate and collaborate with our customers and SI partners who are adopting our Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize the agreed-upon ETL architecture goals of our customers, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives, utilizing best practices for the design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated, take proactive action for the benefit of our customers, and ensure that they succeed in their journey to Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities:
- Build out technical processes from specifications provided in high-level design and data specification documents.
- Integrate test and validation processes and methods into every step of the development process.
- Work with lead architects and provide input into defining user stories, scope, acceptance criteria, and estimates.
- Take a systematic problem-solving approach, coupled with a sense of ownership and drive.
- Work independently in a fast-paced Agile environment.
- Actively contribute to the knowledge base from every project you are assigned to.

Qualifications:
- Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks.
- 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts.
- Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration.
- Experience working with relational and/or NoSQL databases.
- Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.).
- Ability to work independently and within a team.

Nice to have:
- Insurance industry experience.
- Experience with ADF or AWS Glue.
- Experience with Azure Data Factory and Spark/Scala.
- Experience with the Guidewire Data Platform.
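The ETL vs. ELT distinction the qualifications call out can be sketched in a few lines of plain Python (the record shape is invented for illustration): in ETL the transform runs before loading into the target; in ELT raw rows land first and are transformed inside the target.

```python
def extract():
    # Source rows as they arrive from an upstream system (illustrative data).
    return [{"policy_id": "P-1", "premium": "1200.50"},
            {"policy_id": "P-2", "premium": "980.00"}]

def transform(rows):
    # Cast types; in a real pipeline this is where cleansing/derivation lives.
    return [{"policy_id": r["policy_id"], "premium": float(r["premium"])}
            for r in rows]

def load(rows, target):
    target.extend(rows)

# ETL: transform first, then load the cleaned rows.
warehouse_etl = []
load(transform(extract()), warehouse_etl)

# ELT: load raw rows, then transform inside the target store.
warehouse_elt = []
load(extract(), warehouse_elt)
warehouse_elt[:] = transform(warehouse_elt)
```

Both paths end with identical data; the trade-off is where the compute happens (pipeline engine vs. warehouse).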

Posted 2 weeks ago

Apply

5.0 - 10.0 years

19 - 27 Lacs

Chennai

Hybrid

Location - SIPCOT, Siruseri. Work Mode - Hybrid (3 days a week in the office). Total Experience - 5 to 10 yrs. No. of positions - 4. Notice Period - 1 week to 2 weeks.

The JD is as follows:
- Must have 5+ to 10 years of IT experience and at least 5 years of hands-on experience in Databricks.
- Design and develop scalable data pipelines using Azure Databricks, Python, PySpark, and Delta Lake.
- Have a good understanding of RDBMS and proficiency in writing complex SQL.
- Implement ETL/ELT workflows for structured, semi-structured, and unstructured data.
- Optimise the performance of data processing and ensure data quality standards.
- Work with Azure Data Lake and Unity Catalog, and integrate with CI/CD pipelines.
- Experience with orchestration tools like Airflow or Azure Data Factory is a plus.
- Have working experience on a migration project.

Resumes can be shared with devika.raju@kumaran.com or WhatsApp - 8122770798

Posted 2 weeks ago

Apply

3.0 - 5.0 years

12 - 15 Lacs

Pune

Work from Office

Tech stack:
- React, Node.js, Python, JavaScript (optionally PHP)
- AWS (Lambda, EC2, S3, API Gateway, CodePipeline), Azure
- Docker, Kubernetes, CI/CD (GitHub, AWS CodePipeline)
- SQL, NoSQL, Redis, WebSockets, Message Queues
- ETL Tools: AWS Glue, ADF, SSIS, KNIME

Required Candidate profile:
- DevOps and infrastructure monitoring
- Developing full-stack cloud-native applications
- Managing data pipelines and cloud infrastructure
- Ensuring CI/CD practices, performance tuning, and code quality
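As a working-knowledge sketch of the message-queue item in the stack list, the producer/consumer pattern looks like this with Python's stdlib queue and a worker thread (names and payloads are illustrative):

```python
import queue
import threading

def worker(q: queue.Queue, results: list) -> None:
    """Consume messages until a None sentinel arrives."""
    while True:
        msg = q.get()
        if msg is None:
            q.task_done()
            break
        results.append(msg.upper())  # stand-in for real message processing
        q.task_done()

q: queue.Queue = queue.Queue()
results: list = []
t = threading.Thread(target=worker, args=(q, results))
t.start()

# Producer side: enqueue events, then signal shutdown with a sentinel.
for payload in ["order.created", "order.paid"]:
    q.put(payload)
q.put(None)
q.join()   # block until every enqueued message has been processed
t.join()
```

A broker such as SQS, Redis Streams, or Kafka replaces the in-process queue in production, but the enqueue/consume/ack shape is the same.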

Posted 2 weeks ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

Experience using SQL, PL/SQL, or T-SQL with RDBMSs like Teradata, MS SQL Server, or Oracle in production environments. Experience with Python, ADF, Azure, and Databricks. Experience working with Microsoft Azure/AWS or other leading cloud platforms.

Required Candidate profile: Hands-on experience with Hadoop, Spark, Hive, or similar frameworks. Data integration & ETL, data modelling, database management, data warehousing, big-data frameworks, CI/CD.

Perks and benefits: To be disclosed post interview.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Noida

Work from Office

For Solution Architect: at least 8 years of experience in APEX, ADF, Workflow, ATP, PL/SQL, and OIC. Immediate joiners only. Job type: Consultant for 3 months. Location: Noida. Please share your profile with anwar.shaik@locuz.com & priyanka.p@locuz.com

Posted 3 weeks ago

Apply

8.0 - 12.0 years

18 - 25 Lacs

Bengaluru

Hybrid

We encourage applications from passionate Java developers who can join us immediately or within 15 to 30 days and are excited to make an impact.

Role & responsibilities:
- Design, develop, and maintain scalable microservices using Java (Java 8 to Java 21) and Spring Boot.
- Architect, implement, and support cloud-native applications leveraging microservice architecture.
- Build and integrate event-driven systems using Apache Kafka.
- Develop and maintain CI/CD pipelines ensuring smooth automated deployments (Jenkins, GitHub Actions, etc.).
- Work with relational databases (Oracle, SQL) as well as NoSQL databases, ensuring data integrity and performance optimization.
- Write robust, maintainable, and testable code using JUnit and industry best practices.
- Use Gradle or Maven for build and dependency management.
- Build and deploy containerized applications using Docker; manage images and registries.
- Collaborate effectively using GitHub for version control and code reviews.
- Participate in architectural discussions and provide solutions to complex business problems.
- Troubleshoot and resolve production issues; ensure system reliability and performance.
- Mentor junior team members and contribute to improving coding standards and practices.

Preferred candidate profile:
- Strong hands-on expertise in Java 8 and above (up to Java 21), with solid experience in Spring Boot.
- Proven experience in developing and deploying microservice-based architectures.
- Experience with cloud platforms, especially Azure (AKS, ADF, Databricks), is highly preferred.
- Good understanding of containerization (Docker) and orchestration concepts.
- Knowledge of CI/CD tools and modern DevOps practices.
- Proficient in working with both SQL (Oracle, MySQL, etc.) and NoSQL databases.
- Experience with build tools like Gradle or Maven.
- Familiar with development tools and IDEs such as IntelliJ IDEA or Eclipse.
- Strong understanding of version control using GitHub.

Good to have:
- Knowledge of Python for scripting or automation.
- Shell scripting skills and familiarity with basic Linux commands.
- Exposure to Azure-specific services and data engineering workflows (ADF, Databricks).

Key Skills: Java, Spring Boot, Microservices, Kafka, Docker, CI/CD, Oracle/SQL, NoSQL, Gradle, Maven, GitHub, Azure, AKS, ADF, Databricks, JUnit, IntelliJ/Eclipse.

Why Join Us? Opportunity to work on cutting-edge cloud-native solutions. Collaborative and innovative work environment. Exposure to a modern tech stack and the latest tools. Strong career growth and learning opportunities.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

14 - 24 Lacs

Chennai

Hybrid

Role: Azure Integration Developer. Location: Chennai. Work mode: Hybrid. Notice Period: Immediate joiner to 15 days.

Requirements:
- At least 7-9 years of experience in integration development.
- Certifications in relevant integration technologies: Boomi Professional Developer, Azure Integration Certification.
- Must have experience with the Boomi / Azure platform.
- Experience with Azure Integration Services (ASB, Function Apps, ADF, Logic Apps).
- Good understanding of Azure Cloud and Azure APIM.
- Good understanding of all integration design patterns.
- Strong experience with APIs, web services (REST/SOAP), and message queuing systems.
- Strong knowledge of XML, JSON, XSLT, and other data interchange formats.
- Familiarity with cloud platforms (AWS, Azure, Google Cloud) and microservices architecture.
- Strong problem-solving and analytical abilities.
- Effective communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Detail-oriented with a focus on quality and accuracy.

Nice to have:
- Experience with DevOps practices and tools (CI/CD pipelines, Docker, Kubernetes).
- Knowledge of data security and compliance standards.
- Familiarity with Agile/Scrum methodologies.
- Excellent communication skills.

Interested candidates, share your resume with dipti.bhaisare@in.experis.com
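The data-interchange requirements above (XML, JSON, XSLT) come down to moving the same payload between formats. A minimal JSON-to-XML mapping with the standard library might look like this (the payload and tag names are illustrative only):

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(payload: str, root_tag: str = "message") -> str:
    """Convert a flat JSON object into a simple XML document."""
    data = json.loads(payload)
    root = ET.Element(root_tag)
    for key, value in data.items():
        # Each top-level JSON key becomes a child element with text content.
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = json_to_xml('{"orderId": 42, "status": "shipped"}')
```

Nested structures, attributes, and namespaces need a real mapping convention (or XSLT), but the element-per-field idea is the same.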

Posted 1 month ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Hyderabad

Work from Office

Highly skilled Senior Data Engineer with over 6 years of experience in designing, developing, and implementing advanced business intelligence solutions on Microsoft Azure. The engineer should have hands-on expertise in ADF, Synapse, Power BI, and the Azure DevOps platform.

Key Responsibilities:
Collaborate with stakeholders to plan, design, develop, test, and maintain KPI data and dashboards on Azure and Power BI.

The candidate should have the following skills:
- Proficient in ETL processes, data modelling, and the DAX query language on Microsoft Azure.
- Proven track record of collaborating with stakeholders to gather requirements and deliver actionable insights.
- Independently handle DevOps in ADF, Synapse, and Power BI.
- Proficient in business requirements gathering and analysis.
- Strong data analysis skills, including data interpretation and visualization.
- Familiarity with process modelling and documentation.
- Adept at creating interactive and visually compelling reports and dashboards to support data-driven decision-making in Power BI.
- Excellent stakeholder management and communication skills.
- Knowledge of Agile methodologies and project management practices.
- Ability to develop and articulate clear and concise user stories and functional requirements.
- Proficiency in using data visualization tools like Power BI.
- Comfortable with conducting user acceptance testing (UAT) and quality assurance.

Educational Qualification: Graduate/Postgraduate degree in Computer Science, Master's in Business Administration, certification in Power BI and Microsoft Azure services.

Skills:
- Proven experience of 6+ years as a Data Engineer and Data Visualization developer.
- Expertise and experience in ADF, Synapse, and Power BI.
- Demonstrates an understanding of the IT environment, including enterprise applications like HRMS, ERPs, CRM, manufacturing systems, API management, web scraping, etc.
- Industry experience in Manufacturing.

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 20 Lacs

Hyderabad

Work from Office

We are seeking an experienced Senior SQL Developer with ADF (Azure Data Factory) to join our team, with a specific focus on T-SQL. You will be responsible for designing, developing, and maintaining database structures, optimizing queries, and managing complex configurations. All of this must be achieved while adhering to the highest security best practices.

Responsibilities:
- Design, implement, and maintain database structures.
- Create complex ADF pipelines with different sources and destinations, such as Blob Storage, SFTP, and Azure SQL Database.
- Develop solutions and deliver projects using Azure Data Factory.
- Develop and update existing stored procedures and functions using T-SQL.
- Research required data and develop complex SQL queries.
- Develop procedures and scripts for data migration.
- Ensure performance, security, and availability of databases.
- Collaborate with other team members and stakeholders.
- Prepare documentation and specifications related to database design and architecture.

Required Skills and Experience:
- Minimum of 8+ years of experience in SQL development with ADF.
- Proven work experience as a Senior SQL Developer or in a similar role.
- Technical skills in T-SQL, SQL Server, MSBI, ADF, SSIS, and Azure Data Lake Storage to develop and maintain MS SQL solutions.
- Familiarity with Power BI and Azure Databricks is a plus.
- Excellent understanding of T-SQL programming.
- Knowledge of SQL Server Management Studio (SSMS), SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
- Proficient understanding of indexes, views, handling large data, and complex joins.
- Experience with performance tuning, query optimization, Performance Monitor, and other related monitoring and troubleshooting tools.
- Excellent written and verbal communication skills.
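For orientation, a Copy activity from Blob Storage to Azure SQL Database, the kind of pipeline described above, is authored in ADF as JSON roughly like the following. This is a hedged sketch: dataset names are invented, and the authoritative schema is Microsoft's ADF pipeline reference, not this fragment.

```json
{
  "name": "CopyBlobToAzureSql",
  "properties": {
    "activities": [
      {
        "name": "CopyCsvToTable",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobCsvDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "AzureSqlTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

The datasets referenced here are defined separately and bind to linked services holding the storage and database connection details.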

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Chennai

Work from Office

Job Summary:
We are hiring an experienced Application Security Engineer specializing in Java ADF and Jasper Reports, with a strong track record of resolving Vulnerability Assessment and Penetration Testing (VAPT) findings. The ideal candidate must have secured complex enterprise applications, including online payments and eCommerce systems, particularly on legacy stacks such as Java 1.7, MySQL 5.5, and JBoss 7.1. This role is hands-on and remediation-focused, requiring a deep understanding of secure development and hardening in deprecated environments.

Key Responsibilities:
- Lead remediation of high-priority VAPT findings in large-scale enterprise systems.
- Secure passwords and PII data at all stages:
  - At view/input: masking, form validation, secure front-end patterns
  - In transit: TLS, secure headers, HTTPS enforcement
  - At rest: encryption, proper salting and hashing (e.g., bcrypt, SHA-256)
- Fix injection attacks (SQLi, XSS, LDAPi, command injection), CSRF, clickjacking, IDOR, and other OWASP Top 10 issues.
- Apply secure API integration practices: auth tokens, rate limiting, input validation.
- Harden session and cookie management (HttpOnly, Secure, SameSite attributes, session fixation prevention).
- Review and fix insecure code in ADF Faces, Task Flows, Bindings, BC4J, and Jasper Reports.
- Secure Jasper Reports generation and access (parameter validation, report-level authorization, export sanitization).
- Work hands-on with legacy platforms (Java 1.7, MySQL 5.5, JBoss 7.1), applying secure remediation without disrupting production.
- Strengthen security of online payment/eCommerce systems with proven compliance (e.g., PCI-DSS).
- Maintain detailed remediation logs, documentation, and evidence for audits and compliance (GDPR, DPDPA, STQC, etc.).

Technical Skills:
- Java EE, Oracle ADF (ADF Faces, Task Flows, BC4J), Jasper Reports Studio/XML
- Strong debugging skills in Java 1.7, MySQL 5.5, JBoss 7.1
- Secure development lifecycle practices with a focus on legacy modernization
- Strong grounding in OWASP Top 10, SANS 25, CVSS, and secure coding principles
- Experience in PII handling, data masking, salting, and hashing
- Proficiency in OAuth2, SAML, JWT, and RBAC security models
- Performance improvement and application profiling
- Expertise in analyzing application, system, and security logs to identify and fix issues
- Ability to ensure application stability and high availability
- Be the champion/lead and guide the team to fix issues
- PHP experience is a plus, especially in legacy web app environments

Required Experience:
- 5-10+ years in application development and security
- Demonstrated experience remediating security vulnerabilities in eCommerce and payment platforms
- Ability to work independently in production environments with deprecated technologies

Preferred Qualifications:
- B.E./B.Tech/MCA in Computer Science, IT, or Cybersecurity
- Use of AI tools for identifying and fixing issues is a real plus
- Any VAPT or application security certification is a plus (e.g., CEH, OSCP, CSSLP, GWAPT, Oracle Certified Expert)
- Familiarity with compliance standards: PCI-DSS, GDPR, DPDPA, STQC
- Proficiency with security tools: Fortify, ZAP, SonarQube, Checkmarx, Burp Suite

Soft Skills:
- Strong problem-solving and diagnostic capabilities, especially in large monolithic codebases
- Good documentation and communication skills for cross-functional collaboration
- Able to work under pressure, troubleshoot complex issues, and deliver secure code fixes rapidly
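The "proper salting and hashing" requirement above can be sketched concisely. The posting names bcrypt and SHA-256; the sketch below uses PBKDF2-HMAC-SHA256 from Python's standard library as a stand-in, since bcrypt requires a third-party package; the storage format is an assumption for illustration:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, *, iterations: int = 210_000) -> str:
    """Hash a password with a random per-user salt using PBKDF2-HMAC-SHA256."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Illustrative self-describing format: algorithm$iterations$salt$digest.
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash with the stored salt; compare in constant time."""
    _, iters, salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iters)
    )
    return hmac.compare_digest(candidate.hex(), digest_hex)
```

The per-user random salt defeats rainbow tables, and `hmac.compare_digest` avoids timing side channels, both standard expectations in the VAPT findings this role remediates.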

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 13 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Hello Candidates, We are Hiring!!
Job Position - Sr. Alfresco Developer
Experience - 5+ Years
Location - Pune, Mumbai, Chennai
Work mode - Hybrid (3 days WFO)

JOB DESCRIPTION
We are seeking a skilled Alfresco Developer to join our team, responsible for designing, developing, and implementing enterprise content management (ECM) solutions using Alfresco. The ideal candidate will have a strong background in Alfresco architecture, content modeling, and integration with enterprise systems, alongside DevOps experience for deployment and scalability.

Key Responsibilities:
• Design and develop custom solutions using Alfresco Content Services, Share, and ADF.
• Develop and manage content models, workflows, and security models (ACLs).
• Create and manage Alfresco module packages for system customization.
• Implement and maintain Records Management and Governance Services features.
• Utilize Alfresco APIs for application integration and extension.
• Integrate Alfresco with external enterprise applications (e.g., ERP, CRM).
• Configure and optimize Solr for content indexing and search performance.
• Support and enhance system security, permissions, and compliance.
• Work with DevOps tools including Docker and containerization for deployment and CI/CD processes.
• Administer and optimize databases including PostgreSQL or MySQL.
• Monitor, troubleshoot, and resolve production issues.

Required Skills & Qualifications:
• Strong experience with Alfresco Content Services and Alfresco Share / ADF.
• Expertise in content modeling, workflow design, and security (ACLs).
• Familiarity with Alfresco Governance Services and Records Management.
• Proficiency with Alfresco APIs (REST, CMIS).
• Solid understanding of Solr indexing and search integration.
• Experience with enterprise application integration.
• Hands-on experience with Docker, containers, and CI/CD pipelines.
• Working knowledge of PostgreSQL or MySQL.
• Excellent problem-solving skills and the ability to work independently or in a team environment.

Preferred Qualifications:
• Alfresco Certified Engineer or Administrator (ACE / ACA) is a plus.
• Experience with Kubernetes or cloud platforms (AWS, Azure) is an advantage.
• Familiarity with Agile development practices.

NOTE - Interested candidates can share their resume with shrutia.talentsketchers@gmail.com

Posted 1 month ago

Apply

6.0 - 9.0 years

20 - 27 Lacs

Noida, Chennai, Bengaluru

Hybrid

Data Engineer JD:
- 5+ years of experience with cloud data engineering (preferably Azure/ADB/ADF), data pipelines, and Spark.
- Work with Databricks and write optimized and efficient code using PySpark, Spark SQL, and Python.
- Develop and maintain ETL processes using Databricks notebooks and workflows.
- Implement and optimize data pipelines for data transformation and integration.
- Knowledge of one or more SQL variants, preferably PL/SQL and Spark SQL.
- Write complex SQL queries for data retrieval, manipulation, and analysis.
- Debug code when required and troubleshoot any Python, PySpark, or SQL related queries.
- Good experience with version control (Git) and CI/CD.
- Excellent problem-solving ability with solid communication and collaboration skills.

Relevant candidates can drop a mail to roshini.k@wipro.com with an updated resume and the details below:
TEX :
REX :
Current company :
CCTC :
ECTC :
Notice Period : (LWD if serving)
Counter offer CTC if any :
Location :

Posted 2 months ago

Apply

5.0 - 10.0 years

8 - 17 Lacs

Hyderabad

Work from Office

Share your CV with shilpa.srivastava@orcapod.work. Quality Assurance Tester; include CTC, expected CTC, and notice period (max up to 15 days notice only).

The key skills required:

Test and QA Skills:
- Demonstrates a working understanding of planning, developing, and coordinating testing activities, including test plan creation, test case creation, debugging, execution, and test analysis.
- Demonstrates an understanding of estimation techniques and QA planning.
- Demonstrates analytical skills in assessing user, functional, and technical requirements.
- Demonstrates a working understanding of functional testing techniques and strategies.
- Demonstrates a strong understanding of web testing techniques and strategies.
- Demonstrates a working understanding of cloud (Azure) based services.
- Demonstrates a working understanding of test analysis and design.
- Demonstrates a working understanding of analyzing test results and creating appropriate test metrics.
- Demonstrates a working understanding of the defect management process.
- Demonstrates a working understanding of quality assurance and/or software development processes and methodologies, with the ability to share that knowledge with peers and project team members.
- Identifies ways of working smarter through the elimination of unnecessary steps or duplication.

Tools/Technology Skills:
- Demonstrates an understanding of, and working experience with, SQL databases and ETL testing.
- Should be able to write queries to validate table mappings and structures.
- Should be able to perform schema validations.
- Good understanding of SCD types.
- Strong knowledge of database methodology.
- In-depth understanding of Data Warehousing/Business Intelligence concepts.
- Working experience with cloud (Azure) based services.
- Working experience in testing BI reports.
- Should be able to write queries to validate data quality during migration projects.
- Demonstrates an understanding of any of the peripheral technologies utilized in SDC, including PeopleSoft, SAP, and Aderant.
- Demonstrates a working understanding of tools like UFT and TFS.
- Experience with Microsoft tools is highly desired.
- Understands enterprise-wide networks and software implementations.
- Must have previous experience in creating complex SQL queries for data validation.
- Must have testing experience in an Enterprise Data Warehouse (EDW).
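The "write queries to validate data quality during migration" skill can be illustrated with a self-contained sketch: a row-count reconciliation and a content diff between a source and a target table, run here against in-memory SQLite so it executes anywhere (table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

# Row-count reconciliation: both sides must agree after the load.
src_count, tgt_count = conn.execute(
    "SELECT (SELECT COUNT(*) FROM src), (SELECT COUNT(*) FROM tgt)"
).fetchone()

# Content check: rows present on one side but not the other, in either direction.
mismatches = conn.execute("""
    SELECT * FROM (SELECT id, amount FROM src EXCEPT SELECT id, amount FROM tgt)
    UNION ALL
    SELECT * FROM (SELECT id, amount FROM tgt EXCEPT SELECT id, amount FROM src)
""").fetchall()
```

An empty `mismatches` result plus matching counts is the usual pass condition for a migration validation test case; the same EXCEPT-based pattern carries over to T-SQL and other warehouses.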

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 13 Lacs

Chennai

Work from Office

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain Azure Data Factory (ADF) pipelines for data integration.
- Collaborate with cross-functional teams to gather requirements and design solutions using ADF.
- Develop complex data transformations using SQL Server Integration Services (SSIS), DDL/DML statements, and other tools.
- Troubleshoot issues related to pipeline failures or errors in the pipeline execution process.
- Optimize pipeline performance by analyzing logs, identifying bottlenecks, and implementing improvements.

Posted 2 months ago

Apply

12 - 22 years

35 - 50 Lacs

Hyderabad, Pune, Chennai

Work from Office

Looking for experts in any one of the following:
- DATA ARCHITECT - ADF, HDInsight, Azure SQL, PySpark, Python
- BI ARCHITECT - Tableau, Power BI, and Azure SQL
- MDM ARCHITECT - Reltio, Profisee, MDM
- INFORMATICA ARCHITECT - Informatica, MDM, SQL, Python

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
