
7 ADF Framework Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

20 - 27 Lacs

Noida, Chennai, Bengaluru

Hybrid

Source: Naukri

Data Engineer JD:
- 5+ years of experience in cloud data engineering (preferably Azure/ADB/ADF), data pipelines and Spark.
- Work with Databricks and write optimized, efficient code using PySpark, Spark SQL and Python (see the ETL sketch after this listing).
- Develop and maintain ETL processes using Databricks notebooks and workflows.
- Implement and optimize data pipelines for data transformation and integration.
- Knowledge of one or more SQL variants, preferably PL/SQL and Spark SQL.
- Write complex SQL queries for data retrieval, manipulation, and analysis.
- Debug code when required and troubleshoot Python, PySpark or SQL related issues.
- Good experience with version control (Git) and CI/CD.
- Excellent problem-solving ability with solid communication and collaboration skills.

Relevant candidates can mail an updated resume to roshini.k@wipro.com with the details below:
TEX :
REX :
Current company :
CCTC :
ECTC :
Notice Period : (LWD if serving)
Counter offer CTC if any :
Location :
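A minimal sketch of the Databricks-style PySpark ETL pattern this listing describes. The storage path and table names (raw orders CSVs, curated.orders) are hypothetical placeholders; the real pipelines, schemas and storage locations would come from the employer's environment.

```python
# Minimal PySpark ETL sketch: read raw data, apply transformations, write a curated Delta table.
# The input path and output table name are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files landed in cloud storage (e.g. ADLS mounted in Databricks).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/orders/"))

# Transform: basic cleansing, typing and filtering with PySpark functions.
curated = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

# Load: write the result as a Delta table that downstream jobs and SQL users can query.
(curated.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("curated.orders"))
```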

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

Share your CV with shilpa.srivastava@orcapod.work for a Quality Assurance Tester role, including current CTC, expected CTC and notice period (maximum 15 days' notice only).

Test and QA skills:
- Working understanding of planning, developing and coordinating testing activities, including test plan creation, test case creation, debugging, execution and test analysis.
- Understanding of estimation techniques and QA planning.
- Analytical skills in assessing user, functional and technical requirements.
- Working understanding of functional testing techniques and strategies.
- Strong understanding of web testing techniques and strategies.
- Working understanding of cloud (Azure) based services.
- Working understanding of test analysis and design.
- Working understanding of analyzing test results and creating appropriate test metrics.
- Working understanding of the defect management process.
- Working understanding of quality assurance and/or software development processes and methodologies, with the ability to share that knowledge with peers and project team members.
- Identifies ways of working smarter by eliminating unnecessary steps or duplication.

Tools/technology skills:
- Understanding of and working experience with SQL databases and ETL testing; able to write queries to validate table mappings and structures (see the validation sketch after this listing).
- Able to perform schema validations.
- Good understanding of SCD types.
- Strong knowledge of database methodology.
- In-depth understanding of Data Warehousing/Business Intelligence concepts.
- Working experience with cloud (Azure) based services.
- Working experience in testing BI reports.
- Able to write queries to validate data quality during migration projects.
- Understanding of any of the peripheral technologies utilized in SDC, including PeopleSoft, SAP and Aderant.
- Working understanding of tools like UFT and TFS; experience with Microsoft tools is highly desired.
- Understands enterprise-wide networks and software implementations.
- Must have previous experience creating complex SQL queries for data validation.
- Must have testing experience in an Enterprise Data Warehouse (EDW).
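A minimal sketch of the kind of SQL-based ETL validation this listing asks for, written in Python with pyodbc. The connection string, schemas and table/column names (stg.orders, dw.fact_orders, order_id) are hypothetical placeholders for illustration only.

```python
# Hedged ETL validation sketch: reconcile row counts and spot-check a source-to-target mapping.
# Connection details and table names below are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=edw;"
    "UID=tester;PWD=secret"
)
cursor = conn.cursor()

# Check 1: row-count reconciliation between staging and the loaded fact table.
cursor.execute("SELECT COUNT(*) FROM stg.orders")
source_count = cursor.fetchone()[0]
cursor.execute("SELECT COUNT(*) FROM dw.fact_orders")
target_count = cursor.fetchone()[0]
assert source_count == target_count, f"Row counts differ: {source_count} vs {target_count}"

# Check 2: records present in staging but missing from the target (mapping or load defects).
cursor.execute("""
    SELECT s.order_id
    FROM stg.orders AS s
    LEFT JOIN dw.fact_orders AS t ON t.order_id = s.order_id
    WHERE t.order_id IS NULL
""")
missing = cursor.fetchall()
assert not missing, f"{len(missing)} source rows missing from target"

print("Validation checks passed")
```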

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 13 Lacs

Chennai

Work from Office

Source: Naukri

Roles and Responsibilities:
- Design, develop, test, deploy and maintain Azure Data Factory (ADF) pipelines for data integration.
- Collaborate with cross-functional teams to gather requirements and design solutions using ADF.
- Develop complex data transformations using SQL Server Integration Services (SSIS), DDL/DML statements and other tools.
- Troubleshoot issues related to pipeline failures or errors during pipeline execution (see the monitoring sketch after this listing).
- Optimize pipeline performance by analyzing logs, identifying bottlenecks and implementing improvements.
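A minimal sketch, using the azure-identity and azure-mgmt-datafactory Python SDKs, of triggering an ADF pipeline run and polling its status while troubleshooting failures. The subscription, resource group, factory, pipeline name and parameter are hypothetical placeholders; real authentication and pipelines would be specific to the employer's environment.

```python
# Hedged sketch: trigger an ADF pipeline run and poll its status with the Azure SDK for Python.
# Resource group, factory, pipeline and parameter names are hypothetical placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<subscription-id>"
client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Kick off a run of an existing pipeline, passing runtime parameters if it defines any.
run = client.pipelines.create_run(
    resource_group_name="my-rg",
    factory_name="my-adf",
    pipeline_name="CopySalesPipeline",
    parameters={"load_date": "2024-01-01"},
)

# Poll the run until it finishes, then inspect the outcome (useful when diagnosing failures).
while True:
    status = client.pipeline_runs.get("my-rg", "my-adf", run.run_id)
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline run {run.run_id} finished with status: {status.status}")
```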

Posted 3 weeks ago

Apply

12 - 22 years

35 - 50 Lacs

Hyderabad, Pune, Chennai

Work from Office

Source: Naukri

Looking for experts in any one of the following:
- DATA ARCHITECT: ADF, HDInsight, Azure SQL, PySpark, Python
- BI ARCHITECT: Tableau, Power BI and Azure SQL
- MDM ARCHITECT: Reltio, Profisee, MDM
- INFORMATICA ARCHITECT: Informatica, MDM, SQL, Python

Posted 1 month ago

Apply

7 - 12 years

12 - 20 Lacs

Chennai, Delhi NCR, Ahmedabad

Work from Office

Source: Naukri

Develop and maintain data processing workflows using technologies such as Apache Beam, Apache Spark, or similar.
Tools & services: CI/CD (GitLab/GitHub), Ansible, ETL, pipeline and workflow tooling.

Required candidate profile:
- Experience with an ingestion framework on Hadoop (Cloudera), HDFS and Hive (see the ingestion sketch after this listing).
- Languages: Python, PySpark.

Good to have:
- Cloud services: Azure, ADF, Event Hub.
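A minimal sketch, assuming a Hive-enabled Spark session, of the Hadoop/Hive ingestion step this profile describes. The HDFS path, database and table names (landing events files, analytics.events) are hypothetical placeholders.

```python
# Hedged sketch: ingest raw files from HDFS and expose them as a partitioned Hive table.
# HDFS path, database and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("events_ingestion")
         .enableHiveSupport()   # lets saveAsTable/spark.sql work against the Hive metastore
         .getOrCreate())

# Read newline-delimited JSON files landed on HDFS by the upstream system.
events = spark.read.json("hdfs:///data/landing/events/")

# Light standardisation before publishing to the analytics layer.
events = (events
          .withColumn("event_date", F.to_date("event_ts"))
          .dropna(subset=["event_id"]))

# Write as a Hive table partitioned by date so downstream Hive/Spark SQL queries stay efficient.
(events.write
 .mode("append")
 .partitionBy("event_date")
 .saveAsTable("analytics.events"))

# Example downstream check via Spark SQL against the Hive metastore.
spark.sql("SELECT event_date, COUNT(*) FROM analytics.events GROUP BY event_date").show()
```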

Posted 2 months ago

Apply

5 - 10 years

8 - 18 Lacs

Chennai

Work from Office

Source: Naukri

Location: Chennai & Remote

Role & responsibilities
Role: ADF Developer
Experience: 4+ years in enterprise application development, with hands-on experience in Oracle ADF frameworks.
Location: Chennai (on-site)

Technical skills:
- Core technologies: Java EE, Oracle ADF Business Components, ADF Faces, ADF Controller
- Integration: SOAP/REST web services, PL/SQL, XML, JSON
- Middleware: Familiarity with Oracle Fusion Middleware

Key responsibilities:
- Design and develop robust, user-friendly enterprise applications using Oracle ADF
- Develop and optimize business components and UI layers
- Integrate with backend systems and legacy applications
- Participate in code reviews and ensure adherence to quality standards

Certifications: Oracle Certified Professional (OCP) or equivalent certification is highly preferred.

Cybersecurity considerations:
- Apply secure coding practices and perform threat modeling during development
- Collaborate with security teams to remediate vulnerabilities
- Ensure data protection and compliance with security standards during integration

Soft skills:
- Strong problem-solving and analytical abilities
- Excellent communication and collaboration skills
- Experience working in agile environments

Project relevance:
- Proven experience in delivering scalable and secure enterprise applications
- Ability to integrate modern technologies with existing systems while ensuring compliance with cybersecurity best practices

Posted 2 months ago

Apply

3 - 5 years

12 - 22 Lacs

Bengaluru, Bangalore Rural

Work from Office

Source: Naukri

Experience: 3-5 years (Senior Analyst)
Location: Bangalore (primary), Gurgaon (secondary)
Work mode: Hybrid (3 days in office)
Excellent communication is mandatory.

ROLE SUMMARY
Reporting to the Head of Architecture & Engineering APAC, this is a key role in analyzing, building and maintaining highly scalable data pipelines and automation across a variety of projects, products and data sources. You will play an important part in converting complex data into meaningful insights. You will work closely with our APAC and global teams and partners, developing state-of-the-art data solutions to modernize our business and shape our engineering landscape. You are someone who is passionate about data and automation using cloud and collaboration tools such as MS Azure and M365, combined with software development experience in SQL and Python/.NET. Although the primary focus of this role is applications and data, you will work across a variety of areas that require you to be a generalist in software development.

PRIMARY ROLE
- Design, build, deploy and optimize ETL data pipelines from sources such as financial, investment and operational data using CI/CD processes, primarily with Python and Databricks (see the incremental-load sketch after this listing).
- Integrate diverse internal and external data sources into our strategic data platforms.
- Develop application and workflow automation using cloud and M365 technologies to support our data and automation strategy.
- Develop APIs and database interfaces to support projects and integrations.
- Work on application development projects needed to support our overall data and technology roadmap.
- Provide technical analysis and R&D to solve business problems and help with technical and design decisions.

KEY WORKING RELATIONSHIPS
External: Development & integration partners; technology and cyber security partners; cloud and application vendors; portfolio companies.
Internal: CTO APAC; Heads of Architecture & Engineering, Workplace and Service Delivery; DevOps engineers; internal business stakeholders.

WHAT YOU BRING TO THE ROLE
Required:
- 3+ years of experience in development roles focused on data, integrations and automation.
- Experience developing and deploying applications and data pipelines using Microsoft Azure, Databricks, ETL and Python.
- Experience with the Azure stack, such as ADF, Functions and Logic Apps.
- Experience developing automation using Power Automate or similar.
- Background in the finance, investment or property industry.
- Excellent communication and customer management skills.
Education: A degree in information technology or equivalent.
Preferred:
- Experience across multiple clouds and cloud services.
- Experience with data warehousing and analytics, preferably using Snowflake and Power BI.
- Exposure to machine learning, AI and big data.
- Experience with RPA and no-code platforms.

Interested candidates can share an updated resume to anusha.bc7@wipro.com. References are welcome.
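A minimal sketch, assuming Databricks with Delta Lake, of the incremental-load pattern a Python/Databricks pipeline like the one above typically uses. The table names and key column (staging.positions, curated.positions, position_id) are hypothetical placeholders; the real keys, schemas and orchestration (ADF, CI/CD) would be environment-specific.

```python
# Hedged sketch: upsert the latest staging batch into a curated Delta table (incremental load).
# Table names and the join key are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Latest batch extracted by an upstream ADF / Databricks workflow step.
updates = spark.table("staging.positions")

target = DeltaTable.forName(spark, "curated.positions")

# MERGE keeps the curated table current without reprocessing history:
# matching keys are updated in place, new keys are inserted.
(target.alias("t")
 .merge(updates.alias("s"), "t.position_id = s.position_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```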

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
