
1032 ADF Jobs - Page 39

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5 - 8 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion).
- Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC)
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 2 to 4 years of Oracle Fusion experience
Educational qualification: Graduate/Post Graduate
Education (if blank, degree and/or field of study not specified). Degrees/fields of study required: Bachelor Degree, Master Degree. Degrees/fields of study preferred:
Certifications (if blank, certifications not specified)
Required skills: Oracle Fusion Middleware (OFM)
Optional skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired languages (if blank, desired languages not specified)
Travel requirements: Not Specified
Available for work visa sponsorship? No
Government clearance required? No
Job Posting End Date:
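For orientation only (not part of the posting): Oracle Integration (OIC) flows listed above are commonly exposed as REST endpoints, and a minimal Python sketch of invoking one is shown below. The endpoint URL, credentials, and payload are hypothetical placeholders; real paths and auth depend on how the integration is configured.

```python
# Minimal, hedged sketch: call a REST endpoint exposed by an OIC integration.
# All names/URLs here are assumptions, not details from the job posting.
import requests

OIC_ENDPOINT = "https://example-oic-host/ic/api/integration/v1/flows/rest/SAMPLE_FLOW/1.0/run"  # placeholder
AUTH = ("integration.user@example.com", "app-password")  # placeholder credentials

payload = {"invoiceNumber": "INV-1001", "amount": 2500.00}  # example request body

response = requests.post(OIC_ENDPOINT, json=payload, auth=AUTH, timeout=30)
response.raise_for_status()  # fail loudly if OIC returns an error status
print(response.json())
```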

Posted 2 months ago

Apply

8 - 12 years

11 - 15 Lacs

Hyderabad, Gurgaon

Work from Office

Source: Naukri

Overview: We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit.

PepsiCo Data Analytics & AI Overview: With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization where Data Scientists and ML Engineers report to within the broader D+A organization. DS also leads, facilitates and collaborates with the larger DS community in PepsiCo, provides the talent for the development and support of DS components and their life cycle within DA&AI products, and supports pre-engagement activities as requested and validated by the prioritization framework of DA&AI.

Data Science Senior Analyst: Hyderabad and Gurugram. Your role will be part of a growing team based in Hyderabad, creating and supporting global digital developments for PepsiCo. These developments revolve around topics such as revenue management, supply chain, manufacturing and logistics. This role focuses on Demand Forecasting within the Demand Planning functional area. You will be part of a collaborative, interdisciplinary team around data, where you will be responsible for the continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users, which will give you the right visibility into, and understanding of, the criticality of your developments. You will be an internal ambassador of the team's culture around data and analytics, and will provide stewardship to colleagues in the areas where you are a specialist or are specializing.

Responsibilities:
- Actively contribute to code development in projects and services.
- Act as a contributor in innovation activities.
- Partner with data scientists working on discovery, prototypes and pilots, with a focus on experiment tracking.
- Partner with data engineers on data pipelines and data versioning.
- Build, deploy or publish machine learning models that run efficiently in cloud pipelines.
- Heavy focus on data wrangling implementation, business rules implementation, and feature engineering at scale.
- Help drive optimization, testing, and tooling to improve quality (unit tests).
- Monitor model performance (model decay), working with data scientists on assessing bias and fairness.
- Prototype new approaches and build solutions at scale.
- Support large-scale experimentation done by data scientists.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create and audit reusable packages or libraries.

Qualifications:
- 8+ years working in a team to deliver production-level analytic solutions.
- 4+ years of experience in ETL (ADF and Databricks) and/or data wrangling.
- 4+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems. Experience with Deep Learning is a plus.
- 4+ years of experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics product creation; fluent in Jira and Confluence.
- Experience in cloud-based development and deployment (Azure preferred); fluent with Azure cloud services. Certification is a plus.
- Fluent in Git (version control).
- Fluent in Docker.
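Illustrative only (not from the posting): the "experiment tracking" and "model decay" responsibilities above are commonly handled with a tool such as MLflow. A minimal sketch follows; the dataset, run name, and parameters are made up for the example.

```python
# Hedged sketch: log parameters, a metric, and a model artifact for one experiment run.
import mlflow
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1_000, n_features=10, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="demand-forecast-baseline"):  # placeholder run name
    params = {"n_estimators": 200, "max_depth": 8}
    mlflow.log_params(params)

    model = RandomForestRegressor(**params, random_state=42).fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))

    mlflow.log_metric("mae", mae)             # comparable across runs; useful for spotting decay
    mlflow.sklearn.log_model(model, "model")  # artifact for downstream deployment pipelines
```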

Posted 2 months ago

Apply

4 - 9 years

20 - 25 Lacs

Hyderabad

Work from Office

Source: Naukri

We are seeking a highly skilled Senior Data Engineer with over 5 years of experience in designing, developing, and implementing advanced business intelligence solutions on Microsoft Azure, with hands-on expertise in ADF, Synapse, Power BI and the Azure DevOps platform.

Key Responsibilities: Collaborate with stakeholders to plan, design, develop, test, and maintain KPI data and dashboards on Azure and Power BI.

Required skills:
- Proficient in ETL processes, data modelling, and the DAX query language on Microsoft Azure.
- Proven track record of collaborating with stakeholders to gather requirements and deliver actionable insights.
- Able to independently handle DevOps in ADF, Synapse and Power BI.
- Proficient in business requirements gathering and analysis.
- Strong data analysis skills, including data interpretation and visualization.
- Familiarity with process modelling and documentation.
- Adept at creating interactive and visually compelling Power BI reports and dashboards to support data-driven decision-making.
- Excellent stakeholder management and communication skills.
- Knowledge of Agile methodologies and project management practices.
- Ability to develop and articulate clear and concise user stories and functional requirements.
- Proficiency in data visualization tools such as Power BI.
- Comfortable conducting user acceptance testing (UAT) and quality assurance.

Educational qualification: Graduate/Post Graduate degree in Computer Science or Master of Business Administration; certification in Power BI and Microsoft Azure services.

Experience: At least 6 years of experience delivering large enterprise analytics projects, with overall experience of 8-10 years in enterprise IT or business roles.

Industry to be hired from: Chemical/Pharma/FMCG manufacturing IT, Big 4, or IT/ITES analytics organizations.

Skills:
- Proven experience of 6+ years as a data engineer and data visualization developer.
- Expertise and experience in ADF, Synapse and Power BI.
- Understanding of the IT environment, including enterprise applications such as HRMS, ERPs, CRM, manufacturing systems, API management, web scraping, etc.
- Industry experience in manufacturing.

Competencies (behavioural skills) required:
- Excellent communication skills.
- Analytical skills and strong organizational abilities.
- Attention to detail and problem-solving aptitude.
- Impact and influence across cross-functional teams; leadership skills and experience.
- Well organized, with a highly system- and process-oriented approach.
- Good business partner with customer orientation.
- Self-starter who can manage a range of competing priorities and projects.
- Ability to question the status quo and bring in best-in-class practices.
- Ability to inspire and rally the team around business objectives and excellence.
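Not part of the posting: the "DevOps in ADF" skill above typically includes triggering and monitoring pipeline runs programmatically. Below is a hedged sketch using the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are placeholders.

```python
# Hedged sketch: start an ADF pipeline run and poll it until it reaches a terminal status.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-analytics"         # placeholder
FACTORY_NAME = "adf-kpi-dashboards"     # placeholder
PIPELINE_NAME = "pl_load_kpi_facts"     # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-31"},  # example pipeline parameter
)

while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")
```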

Posted 2 months ago

Apply

5 - 10 years

20 - 22 Lacs

Nasik, Pune

Work from Office

Source: Naukri

Data Engineer – Azure Synapse & Data Pipelines Expert
Experience: 5 to 8 years
Location: Nashik/Pune (work from office only)

Design, develop, and maintain efficient data pipelines using Azure Synapse, Azure Data Factory (ADF) and Azure Databricks (ADB).

Required candidate profile: Experience with Azure Synapse Analytics, ADF, ADB, PySpark, and SQL; proven expertise in designing and optimizing complex data pipelines for high performance and reliability; experience with Data Lake.
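Illustrative sketch only (not part of the posting): a simple PySpark transform of the kind such a pipeline might run inside Databricks, landing curated data as Delta. The storage account, container, and column names are assumptions.

```python
# Hedged sketch: read raw parquet from ADLS, clean it, and write it back as a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/orders/"          # placeholder
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales/orders/"  # placeholder

orders = (
    spark.read.format("parquet").load(raw_path)
    .withColumn("order_date", F.to_date("order_ts"))  # assumed timestamp column
    .filter(F.col("amount") > 0)                      # drop obviously bad rows
    .dropDuplicates(["order_id"])                     # keep re-runs idempotent
)

(orders.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save(curated_path))
```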

Posted 2 months ago

Apply

5 - 10 years

16 - 27 Lacs

Bengaluru

Work from Office

Source: Naukri

We have 2 requirements:
1 - Azure Engineer Specialist. Mandatory skills: ADB, PySpark, ADF, Delta Lake.
2 - Azure Lead. Mandatory skills: ADF, ADB, Synapse, Python, Erwin. We are also looking for someone with hands-on team-leading experience.

A Data Engineer utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle the distribution of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, data products, and data pipelines so that they are resilient to change, modular, flexible, scalable, reusable, and cost-effective.

Key Responsibilities:
- Design and oversee the entire data architecture strategy.
- Mentor junior data architects to ensure skill development in alignment with the team strategy.
- Design and implement complex, scalable, high-performance data architectures that meet business requirements.
- Model data for optimal reuse, interoperability, security, and accessibility.
- Develop and maintain data flow diagrams and data dictionaries.
- Collaborate with stakeholders to understand data needs and translate them into technical solutions.
- Ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration.
- Ensure data quality, integrity, and security across all data systems.

Qualifications:
- Experience in Erwin, Azure Synapse, Azure Databricks, Azure DevOps, SQL, Power BI, Spark, Python, and R.
- Ability to drive business results by building cost-optimal data landscapes.
- Familiarity with Azure AI/ML services; Azure analytics services such as Event Hubs and Azure Stream Analytics; scripting with Ansible.
- Experience with machine learning and advanced analytics.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of CI/CD pipelines and automated testing frameworks.
- Certifications such as AWS Certified Solutions Architect, IBM Certified Data Architect, or similar are a plus.
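Not from the posting: the Delta Lake skill listed above is often exercised as an idempotent upsert (MERGE) into a curated table. A hedged delta-spark sketch follows; paths and join keys are placeholders.

```python
# Hedged sketch: merge a daily batch of updates into a Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

target_path = "/mnt/lake/silver/customers"  # placeholder Delta table path
updates = spark.read.format("parquet").load("/mnt/lake/bronze/customers_daily")  # placeholder source

target = DeltaTable.forPath(spark, target_path)

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")  # assumed business key
       .whenMatchedUpdateAll()      # refresh existing customers
       .whenNotMatchedInsertAll()   # insert new ones
       .execute())
```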

Posted 2 months ago

Apply

7 - 12 years

16 - 27 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Description - Data Engineer: We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, dbt (Data Build Tool), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Role & responsibilities:
- Data pipeline development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
- SQL optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
- Data integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database management: Manage and maintain SQL Server and PostgreSQL databases.
- ETL processes: Develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: Identify and resolve data-related issues and discrepancies.
- Python scripting: Utilize Python for data manipulation, automation, and integration tasks.

Preferred candidate profile:
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.
- Soft skills: excellent problem-solving and analytical skills, strong communication and collaboration skills, a positive attitude and the ability to work well in a team environment.
- Certifications: relevant certifications (e.g., Snowflake, Azure) are a plus.

Please forward your updated profile to the email address below:
divyateja.s@prudentconsulting.com
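For illustration only (not part of the posting): the combination of Snowflake and Python scripting above typically means querying Snowflake from Python. A hedged sketch with snowflake-connector-python follows; the account, warehouse, and table names are placeholders.

```python
# Hedged sketch: run a parameterised query against Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholder account locator
    user="ETL_SVC",               # placeholder
    password="***",               # use a secrets manager in practice
    warehouse="WH_ETL",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT order_id, amount FROM orders WHERE order_date >= %s",
        ("2024-01-01",),  # bind variables instead of string formatting
    )
    for order_id, amount in cur.fetchmany(10):
        print(order_id, amount)
finally:
    conn.close()
```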

Posted 2 months ago

Apply

3 - 5 years

9 - 13 Lacs

Kota

Work from Office

Source: Naukri

Job Description:
- Proven experience in MS Azure data services: ADF, ADLS, Azure Databricks, Delta Lake
- Experience in Python & PySpark development
- Proven experience in SQL development, preferably SQL Server, but other DBMS may be fine
- Good communication (verbal & written) & documentation skills
- Self-starter, motivated and able to work independently while being a good team player
- Good to have: knowledge of analytic models to consume unstructured and social data
- Hands-on experience in Azure Databricks with PySpark
- Strong data warehouse knowledge, strong SQL writing skills, writing stored procedures, Azure Data Factory & Azure Data Lake
- Building a data warehouse solution in the cloud, in particular Azure
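Not from the posting: a small sketch combining the PySpark and SQL-writing skills listed above, registering a Delta table as a view and building an aggregate with Spark SQL. Paths, schema, and column names are assumptions.

```python
# Hedged sketch: expose a Delta table to Spark SQL and write a daily aggregate.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

(spark.read.format("delta")
      .load("abfss://curated@examplestorage.dfs.core.windows.net/sales/orders")  # placeholder
      .createOrReplaceTempView("orders"))

daily_sales = spark.sql("""
    SELECT order_date,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount
    FROM   orders
    GROUP  BY order_date
""")

daily_sales.write.format("delta").mode("overwrite").saveAsTable("gold.daily_sales")  # placeholder schema
```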

Posted 2 months ago

Apply

3 - 8 years

15 - 30 Lacs

Chennai

Work from Office

Source: Naukri

Job Title: Data Engineer
Designation: Senior Associate
Location: Chennai only (relocation costs will be borne for candidates relocating)
Experience Level: 3-5 years

Job Summary: We are seeking an experienced Senior Data Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining data infrastructure on Azure, with an extensive focus on Azure Databricks. You will work hand in hand with our analytics team to support data-driven decision making across different external clients in a variety of industries.

Scope of work:
- Design, build, and maintain scalable data pipelines using Azure Data Factory (ADF), Fivetran and other Azure services.
- Administer, monitor, and troubleshoot SQL Server databases, ensuring high performance and availability.
- Develop and optimize SQL queries and stored procedures to support data transformation and retrieval.
- Implement and maintain data storage solutions in Azure, including Azure Databricks, Azure SQL Database, Azure Blob Storage, and data lakes.
- Collaborate with business analysts, clients and stakeholders to deliver insightful reports and dashboards using Power BI.
- Develop scripts to automate data processing tasks using languages such as Python, PowerShell, or similar.
- Ensure data security and compliance with industry standards and organizational policies.
- Stay updated with the latest technologies and trends in Azure cloud services and data engineering.
- Desired: experience in healthcare data analytics, including familiarity with healthcare data models such as encounter-based or claims-focused models, or manufacturing data analytics or utility analytics.

Ideal candidate profile:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- At least 3-5 years of experience in data engineering with a strong focus on Microsoft Azure and Azure Databricks.
- Proven expertise in SQL Server database administration and development.
- Experience in building and optimizing data pipelines, architectures, and data sets on Azure.
- Experience with dbt and Fivetran.
- Familiarity with Azure AI and LLMs, including Azure OpenAI.
- Proficiency in Power BI for creating reports and dashboards.
- Strong scripting skills in Python, PowerShell, or other relevant languages.
- Familiarity with other Azure data services (e.g., Azure Synapse Analytics, Azure Blob Storage, etc.).
- Knowledge of data modeling, ETL processes, and data warehousing concepts.
- Excellent problem-solving skills and the ability to work independently or as part of a team.

Thanks, Aukshaya

Posted 2 months ago

Apply

3 - 5 years

9 - 13 Lacs

Ranchi

Work from Office

Source: Naukri

Job Description:
- Proven experience in MS Azure data services: ADF, ADLS, Azure Databricks, Delta Lake
- Experience in Python & PySpark development
- Proven experience in SQL development, preferably SQL Server, but other DBMS may be fine
- Good communication (verbal & written) & documentation skills
- Self-starter, motivated and able to work independently while being a good team player
- Good to have: knowledge of analytic models to consume unstructured and social data
- Hands-on experience in Azure Databricks with PySpark
- Strong data warehouse knowledge, strong SQL writing skills, writing stored procedures, Azure Data Factory & Azure Data Lake
- Building a data warehouse solution in the cloud, in particular Azure

Posted 2 months ago

Apply

5 - 8 years

15 - 25 Lacs

Chennai, Pune, Mysore

Hybrid

Source: Naukri

Must have: ADF, Databricks, and Python development.

Posted 2 months ago

Apply

5 - 7 years

7 - 10 Lacs

Pune

Work from Office

Source: Naukri

Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Domain -> Retail -> Vendor Managed Inventory -> Food & Beverages; Technology -> Cloud Integration -> Azure Data Factory (ADF); Technology -> Data On Cloud - NoSQL -> Amazon DynamoDB; Technology -> DevOps -> Continuous Integration - Mainframe
Preferred skills: Technology -> Cloud Platform -> AWS Database; Technology -> Cloud Integration -> Azure Data Factory (ADF); Technology -> DevOps -> Continuous Integration - Mainframe; Technology -> Cloud Platform -> Azure DevOps -> Azure Pipelines

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Educational Requirements: MCA, MTech, Bachelor of Engineering, BCA, BSc, BTech
Service Line: Enterprise Package Application Services
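Not part of the posting: the Amazon DynamoDB skill listed above is most often exercised through the boto3 SDK. A hedged sketch of a write and read follows; the region, table, and attribute names are placeholders, and the table is assumed to use `sku` as its partition key.

```python
# Hedged sketch: upsert and read back a vendor-managed-inventory record in DynamoDB.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")  # placeholder region
table = dynamodb.Table("vendor_inventory")                       # placeholder table name

# Upsert one inventory record for a vendor-managed SKU.
table.put_item(Item={"sku": "SKU-1001", "vendor_id": "V-42", "on_hand_qty": 180})

# Read it back by primary key (assumed partition key: sku).
item = table.get_item(Key={"sku": "SKU-1001"}).get("Item")
print(item)
```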

Posted 2 months ago

Apply

6 - 10 years

9 - 12 Lacs

Trivandrum

Hybrid

Source: Naukri

Role & responsibilities: The Senior Data Engineer will be responsible for designing, implementing, and maintaining data solutions on the Microsoft Azure data platform and SQL Server (SSIS, SSAS, UC4 Atomic), collaborating with various stakeholders, and ensuring the efficient processing, storage, and retrieval of large volumes of data.

Technical expertise and responsibilities:
- Design, build, and maintain scalable and reliable data pipelines.
- Design and build solutions in Azure Data Factory and Databricks to extract, transform and load data into different source and target systems.
- Design and build solutions in SSIS.
- Analyze and understand the existing data landscape and provide recommendations/innovative ideas for rearchitecting/optimizing/streamlining to bring efficiency and scalability.
- Collaborate and communicate effectively with onshore counterparts to address technical gaps, requirement challenges, and other complex scenarios.
- Monitor and troubleshoot data systems to ensure high performance and reliability.
- Be highly analytical and detail-oriented, with extensive familiarity with database management principles.
- Optimize data processes for speed and efficiency.
- Ensure the data architecture supports business requirements and data governance policies.
- Define and execute the data engineering strategy in alignment with the company's goals.
- Integrate data from various sources, ensuring data quality and consistency.
- Stay updated with emerging technologies and industry trends.
- Understand the big-picture business process, utilizing deep knowledge of the banking industry, and translate it into data requirements.
- Enable and run data migrations across different databases and different servers.
- Perform thorough testing and validation to support the accuracy of data transformations and data verification used in machine learning models.
- Analyze data and different systems to define data requirements.
- Be well versed in data structures & algorithms.
- Define data mapping, working along with the business, digital and data teams.
- Data pipeline maintenance/testing/performance validation.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Analyze and identify gaps in data needs and work with business and IT to bring alignment on data needs.
- Troubleshoot and resolve technical issues as they arise.
- Optimize data flow and collection for cross-functional teams.
- Work closely with data counterparts at onshore, product owners, and business stakeholders to understand data needs and strategies.
- Collaborate with IT and DevOps teams to ensure data infrastructure aligns with the overall IT architecture.
- Implement best practices for data security and privacy.
- Drive continuous improvement initiatives within the data engineering function.
- Understand the impact of data conversions as they pertain to servicing operations.
- Manage higher-volume and more complex cases with accuracy and efficiency.

Required skills:
- Design and develop warehouse solutions using Azure Synapse Analytics, ADLS, ADF, Databricks, Power BI, Azure Analysis Services.
- Proficiency in SSIS, SQL and query optimization.
- Experience working in an onshore-offshore model, managing challenging scenarios.
- Expertise in working with large amounts of data (structured and unstructured), building data pipelines for ETL workloads and generating insights utilizing data science and analytics.
- Expertise in Azure, AWS cloud services, and DevOps/CI/CD frameworks.
- Ability to work with ambiguity and vague requirements and transform them into deliverables.
- Good combination of technical and interpersonal skills, with strong written and verbal communication; detail-oriented, with the ability to work independently.
- Drive automation efforts across the data analytics team utilizing Infrastructure as Code (IaC) with Terraform, configuration management, and Continuous Integration/Continuous Delivery (CI/CD) tools such as Jenkins.
- Help build and define architecture frameworks, best practices & processes.
- Collaborate on data warehouse architecture and technical design discussions.
- Expertise in Azure Data Factory, and familiarity with building pipelines for ETL projects.
- Expertise in SQL and experience working with relational databases.
- Expertise in Python and ETL projects.
- Experience with Databricks is an added advantage.
- Expertise in the data life cycle: data ingestion, transformation, loading, validation, and performance tuning.
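Illustrative only (not from the posting): the "thorough testing and validation" of data transformations described above can be expressed as simple pytest checks over a pipeline output. The file path and column names below are assumptions.

```python
# Hedged sketch: minimal data-quality tests for a transformed extract.
import pandas as pd
import pytest


@pytest.fixture
def transformed() -> pd.DataFrame:
    # In a real pipeline this would read the output of the ADF/Databricks transformation step.
    return pd.read_parquet("output/loans_transformed.parquet")  # placeholder path


def test_no_duplicate_loan_ids(transformed):
    assert not transformed["loan_id"].duplicated().any()


def test_mandatory_columns_not_null(transformed):
    for col in ["loan_id", "customer_id", "disbursed_amount"]:
        assert transformed[col].notna().all(), f"Nulls found in {col}"


def test_amounts_are_positive(transformed):
    assert (transformed["disbursed_amount"] > 0).all()
```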

Posted 2 months ago

Apply

7 - 12 years

15 - 22 Lacs

Chennai, Bengaluru, Kochi

Hybrid

Source: Naukri

Role: Azure Data Engineer
Skill: Azure Data Factory
Experience: 7-12 years
Location: Chennai, Kochi and Bangalore

Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes.
- Implement and optimize data storage solutions in data warehouses and data lakes.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Utilize Microsoft Azure tools for data integration, transformation, and analysis.
- Develop and maintain reports and dashboards using Power BI and other analytics tools.
- Ensure data integrity, consistency, and security across all data systems.
- Optimize database and query performance to support data-driven decision-making.

Qualifications:
- 7-12 years of professional experience in data engineering or a related field.
- Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
- Strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Proficiency in Python, PySpark, and PySQL for data processing and analytics tasks.
- Experience with Power BI and other reporting and analytics tools.
- Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimizations in database and query processing.
- Excellent problem-solving, analytical, and communication skills.

Interested candidates can share their updated resume at megha.chattopadhyay@aspiresys.com

Posted 2 months ago

Apply

7 - 12 years

8 - 10 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Our client INFOSYS is looking for a Power BI Developer with ADF for a position in India requiring 7+ years of experience. Contract to hire; work from office.

Job Description:
- Experience as a Power BI Developer
- Must have experience with ADF (Azure Data Factory)

Posted 2 months ago

Apply

4 - 9 years

10 - 20 Lacs

Bengaluru

Hybrid

Source: Naukri

Required Skills & Qualifications:
• Good communication skills and learning aptitude
• Good understanding of the Azure environment
• Hands-on experience in Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Databricks
• Must have hands-on experience with Apache Spark and Scala/Python programming and working with Delta tables; experience in Databricks is an added advantage
• Strong SQL skills: developing SQL stored procedures, functions, dynamic SQL queries, joins
• Hands-on experience ingesting data from various data sources, data types & file types
• Knowledge of Azure DevOps, understanding of build and release pipelines

Good to have:
• Snowflake is an added advantage

Posted 2 months ago

Apply

6 - 11 years

15 - 30 Lacs

Hyderabad

Work from Office

Source: Naukri

Description:
- Understand business requirements and data formats, and actively provide inputs from a data perspective.
- Understand the underlying data and the flow of data.
- Build pipelines & dataflows on large volumes of data arriving every month.
- Implement modules with role-based security and authorization frameworks.

Requirements:
- Minimum 5+ years of experience in SQL and ADF.
- Expert-level, fully hands-on knowledge of Azure Data Factory.
- Experience or knowledge of Azure Data Lake.
- Expert-level knowledge of SQL Server DB & data warehouse.
- Must have worked in ETL or SSIS.
- Expert in SQL programming; exposed to processing large volumes of data; able to analyze and understand complex data; experienced in optimizing queries and SQL objects, audit trails and error handling.

Posted 2 months ago

Apply

8 - 13 years

20 - 35 Lacs

Bengaluru

Hybrid

Source: Naukri

Job Title: Azure Data Engineer
Location: Bangalore (hybrid)
Experience: 8+ years

Minimum Qualifications:
- Degree in Analytics, Business Administration, Data Science or equivalent experience
- Creative problem-solving skills
- Exemplary attention to detail; it is critical that you understand the data before finalizing a query
- Strong written & verbal communication skills
- Ensures queries are accurate (written correctly & providing the intended outcome)
- Skilled at creating documentation to run and troubleshoot the tools/reports you create
- Collaboration and networking ability
- 8+ years of experience using SQL & creating reports in Power BI

Tools: SQL, Snowflake, Power BI, Microsoft Excel, Azure Databricks, ADL, ADF, DAX

Posted 2 months ago

Apply

4 - 6 years

7 - 15 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid

Source: Naukri

Experience: 4 to 6 years
Job Location: Noida / Mumbai / Pune / Bangalore / Gurgaon / Kochi (hybrid work)
Notice: Immediate to 30 days
Skill set: ADF, PySpark, SQL

Role & responsibilities:
• Develop scalable data pipelines using Azure Data Factory (ADF), Databricks, PySpark, and Delta Lake to support ML and AI workloads.
• Optimize and transform large datasets for feature engineering, model training, and real-time AI inference.
• Build and maintain a lakehouse architecture using Azure Data Lake Storage (ADLS) & Delta Lake.
• Work closely with ML engineers & data scientists to deliver high-quality, structured data for training generative AI models.
• Implement MLOps best practices for continuous data processing, versioning, and model retraining workflows.
• Monitor & improve data quality using Azure data quality services.
• Ensure cost-efficient data processing in Databricks using Photon, Delta caching, and auto-scaling clusters.
• Secure data pipelines by implementing RBAC, encryption, and governance.

Required skills & experience:
• 3+ years of experience in data engineering with Azure & Databricks.
• Proficiency in PySpark, SQL, and Delta Lake for large-scale data transformations.
• Strong experience with Azure Data Factory (ADF), Azure Synapse, and Event Hubs.
• Hands-on experience in building feature stores for ML models.
• Experience with ML model deployment and MLOps pipelines (MLflow, Kubernetes, or Azure ML) is a plus.
• Good understanding of generative AI concepts and handling unstructured data (text, images, video, embeddings).
• Familiarity with Azure DevOps, CI/CD for data pipelines, and Infrastructure as Code (Terraform, Bicep).
• Strong problem-solving, debugging, and performance optimization skills.

Interested candidates, kindly share your updated resume at simpy.bagati@infogain.com

Posted 2 months ago

Apply

6 - 10 years

16 - 30 Lacs

Hyderabad

Hybrid

Source: Naukri

JD: SAP HANA/SQL Developer (6-8 years of working experience)

Analyze, plan, design, develop, and implement SAP HANA solutions to meet the strategic, usability, performance, reliability, control, and security requirements of analytics reporting processes. Requires good knowledge in the areas of analytics, data warehousing, reporting applications and ETL processes. Must be innovative. Proficient with SQL programming (preferably working with complex SQL data models, stored procedure programming, data loads, etc.). Working experience with ETL technologies such as Azure Data Factory, SAP BODS, Informatica or SSIS.

Responsibilities:
- Understand the functional requirements and appropriately convert them into technical design documents.
- Assist the team with technical skills whenever an issue is encountered.
- Perform effort estimation for various implementation and enhancement activities.
- Perform troubleshooting and problem resolution for any complex application built.
- Excellent written, verbal, listening, analytical, and communication skills are required.
- Highly self-motivated and directed; experience working in a team-oriented, collaborative environment.
- Take ownership of individual deliverables.
- Work with team members to analyze, plan, design, develop, and implement solutions to meet strategic, usability, performance, reliability, control, and security requirements.
- Support and coordinate the efforts of subject matter experts, development, quality assurance, usability, training, transport management, and other internal resources for the successful implementation of system enhancements and fixes.
- Perform SAP HANA programming as required.
- Troubleshoot SQL data models, procedures, views & indexes.
- Create and maintain internal documentation and end-user training materials as needed.
- Provide input to standards and guidelines and implement best practices to enable consistency across all projects.
- Participate in continuous improvement processes as assigned.
- Knowledge of cloud technologies, such as Azure SQL and Azure Data Factory, is nice to have.
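Not part of the posting: SAP HANA is commonly queried from Python with SAP's hdbcli driver, which pairs with the SQL skills above. A hedged sketch follows; the host, port, credentials, and table are placeholders.

```python
# Hedged sketch: run a parameterised query against SAP HANA using hdbcli.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # placeholder host
    port=30015,                  # placeholder SQL port
    user="REPORT_USER",
    password="***",
)

try:
    cursor = conn.cursor()
    cursor.execute(
        "SELECT MATERIAL, SUM(QUANTITY) FROM SALES_ITEMS WHERE CALMONTH = ? GROUP BY MATERIAL",
        ("202401",),  # hdbcli uses ? placeholders for bind parameters
    )
    for material, qty in cursor.fetchall():
        print(material, qty)
finally:
    conn.close()
```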

Posted 2 months ago

Apply

5 - 8 years

9 - 19 Lacs

Pune, Gurgaon, Noida

Work from Office

Source: Naukri

- Dashboard and report development/enhancement, testing and deployment
- Data visualisation, data analysis and data modelling
- Requirements gathering, data analysis, data quality and data validation
- Continuous improvement, testing, report standardisation and documentation
- Troubleshooting report quality issues, ticket resolution and root cause analysis

Skills: Power BI, SQL/stored procedures, data analysis, data modeling (within PBI), ticket triaging, stakeholder engagement/management

Posted 2 months ago

Apply

16 - 26 years

40 - 60 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

As a Synapse Principal Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to become data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
- Collaborate with clients to understand the overall requirements and create a robust, extensible architecture to meet the client/business requirements.
- Identify the right technical stack and tools that best meet the client requirements.
- Work with the client to define a scalable architecture.
- Design the end-to-end solution along with the data strategy, including standards, principles, data sources, storage, pipelines, data flow, and data security policies.
- Collaborate with data engineers, data scientists, and other stakeholders to execute the data strategy.
- Implement Synapse best practices, data quality and data governance.
- Define the right data distribution/consumption pattern for downstream systems and consumers.
- Own end-to-end delivery of the project and design and develop reusable frameworks.
- Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
- Support business proposals by providing solution approaches, detailed estimations, technical insights and best practices.
- Guide and mentor team members, and create technical artifacts.
- Demonstrate thought leadership.

Job Requirements:
- A total of 16+ years of professional experience, including a minimum of 5 years specifically in architect roles focusing on analytics solutions, and a minimum of 3+ years working with cloud platforms, demonstrating familiarity with public cloud architectures.
- Experience in implementing modern data platform/data warehousing solutions covering all major data solutioning aspects: data integration, harmonization, standardization, modelling, governance, lineage, cataloguing, data sharing and reporting.
- A decent understanding and working knowledge of the ADF, Logic Apps, Dedicated SQL pool, Serverless SQL pool & Spark pool services of Azure Synapse Analytics, focused on optimization, workload management, availability, security, observability and cost management strategies.
- Good hands-on experience writing procedures & scripts using T-SQL and Python.
- Good understanding of RDBMS systems and distributed computing on the cloud, with hands-on experience in data modelling.
- Experience in large-scale migration from on-prem to the Azure cloud.
- Good understanding of Microsoft Fabric.
- Excellent understanding of database and data warehouse concepts.
- Experience working with Azure DevOps.
- Excellent communication & interpersonal skills.

Posted 2 months ago

Apply

7 - 9 years

9 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Cloud Developer (Azure) - AM - BLR - J48799

Working as Assistant Manager/Senior Consultant in this team, your responsibilities will include:
- Strong knowledge of Azure Data Factory, Azure SQL, Azure Function Apps, connectors, Logic Apps, and CI/CD pipelines.
- Experience in Power Platform (Power Apps, Power Automate); Python is a plus.
- Perform software development, unit testing and debugging, which may include the development of new software, reuse of existing code, modification of existing programs, or integration of purchased solutions.
- Document and demonstrate solutions by developing documentation and code comments.
- Actively participate in design reviews by providing creative and practical ideas and solutions in a teamwork environment.
- Ability to work in a team that practices SCRUM/Agile methodologies.
- Ability to work under guidance, with a passion to learn and adapt to new technologies.
- Good analytical and problem-solving skills.
- Excellent written and verbal communication skills.
- Over and above the responsibilities mentioned, the candidate should be able to manage the offshore team technically by providing technical guidance, and work with onsite teams to understand the business problems and provide quality solutions and architecture.

Skills required: Expertise in Azure Data Factory, Azure SQL, Azure Function Apps, connectors, Logic Apps, CI/CD pipelines; experience in Power Platform (Power Apps, Power Automate) is a plus.

Required candidate profile: Candidate experience should be 7 to 9 years. Candidate degree should be: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other, MBA, MCA.
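Illustrative only (not from the posting): the Azure Function Apps skill above, in its simplest Python form, is an HTTP-triggered function. A minimal sketch in the classic programming model follows (the function.json binding file is omitted); the greeting logic is a placeholder for the real integration.

```python
# Hedged sketch: minimal HTTP-triggered Azure Function in Python.
import json

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Accept the name from the query string or a JSON body.
    name = req.params.get("name")
    if not name:
        try:
            name = req.get_json().get("name")
        except ValueError:
            name = None

    if not name:
        return func.HttpResponse("Pass 'name' in the query string or JSON body.", status_code=400)

    return func.HttpResponse(
        json.dumps({"message": f"Hello, {name}"}),
        mimetype="application/json",
    )
```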

Posted 2 months ago

Apply

4 - 7 years

3 - 5 Lacs

Pune

Work from Office

Source: Naukri

Position: SQL Developer
Employment Type: Full Time
Location: Pune, India
Salary: TBC
Work Experience: Applicants for this position should have 4+ years working as a SQL developer.

Project Overview: The project will use a number of Microsoft SQL Server technologies and include development and maintenance of reports, APIs and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager.

Job Description:
- Ability to understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users.
- Investigate and resolve issues quickly.
- Communication with end users.
- Work closely with other team members to understand business requirements.
- Complete structure analysis and systematic testing of the data.

Skills:
- Microsoft SQL Server 2016-2022.
- T-SQL programming (4+ years) experience.
- Query/stored procedure performance tuning.
- SQL Server Integration Services.
- SQL Server Reporting Services.
- Experience in database design.
- Experience with source control.
- Knowledge of the software engineering life cycle.
- Previous experience in designing, developing, testing, implementing and supporting software.
- Third-level IT qualification; SQL MCSA or MCSE preferable.
- Knowledge of data technologies such as Snowflake, Airflow, and ADF desirable.

Other skills:
- Ability to work on own initiative and as part of a team.
- Excellent time management and decision-making skills.
- Excellent communication skills in English, both written and verbal.
- Background in the financial industry preferable.

Academic Qualification: Any graduate or postgraduate degree; any specialization in IT.
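Not part of the posting: the T-SQL and stored-procedure work above is often automated from Python, for example when wiring reports into external systems. A hedged pyodbc sketch follows; the server, database, and procedure names are placeholders.

```python
# Hedged sketch: call a SQL Server stored procedure from Python via pyodbc.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlprod.example.com;DATABASE=FinanceReports;"  # placeholders
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Parameterised call avoids string concatenation and keeps the plan cache stable.
    cursor.execute("EXEC dbo.usp_RefreshDailyPositions @AsOfDate = ?", "2024-01-31")
    conn.commit()
```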

Posted 2 months ago

Apply

6 - 7 years

1 - 2 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Title: Oracle ADF Consultant
Location State: Telangana
Location City: Hyderabad
Experience Required: 6 to 8 years
Shift: Rotational
Work Mode: Onsite
Openings: 3

Interested candidates can share their updated resume at sangeeta.t@varite.com. For more information, contact Sangeeta at 8929376486.

Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: We are a technology consulting and services company with 11,500+ associates in 30+ global locations. More than 145 leading enterprises depend on our expertise to be more disruptive, agile, and competitive. We focus on conceptualizing, designing, engineering, marketing, and managing digital products and experiences for high-growth companies looking to disrupt through innovation and velocity.

About The Job: Experienced Oracle WebCenter Portal Lead Developer with a minimum of 6-7 years of hands-on experience in ADF, JavaScript, and REST APIs. The role involves client-facing responsibilities and requirement gathering for portal development projects.

Key Responsibilities:
- Lead Oracle WebCenter Portal projects.
- Develop ADF applications and integrate REST APIs.
- Extensive knowledge of Oracle custom components.
- Hands-on knowledge of integration with REST APIs, RIDC, and IDOC Script.
- Gather and analyze client requirements.
- Provide technical leadership and mentorship.

Preferred Skills:
- Knowledge of Oracle WebLogic Server and Oracle Database.
- Knowledge of Oracle IDCS (Oracle Identity Cloud Services) is an added advantage.
- Knowledge of Oracle WebCenter Content is an added advantage.
- Good to have certifications in OIC, WebCenter, Java, etc.
- Knowledge of SQL is required.

Essential Job Functions:
- Troubleshoot and resolve technical issues.
- Strong client-facing and requirement-gathering skills.
- Excellent problem-solving and communication skills.

Qualifications: BE/B.Tech/MCA/BCA

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE: 0-2 years of experience - INR 5,000; 2-6 years - INR 7,500; 6+ years - INR 10,000.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of networking, cloud infrastructure, hardware and software, digital marketing and media solutions, clinical diagnostics, utilities, gaming and entertainment, and financial services.

Posted 2 months ago

Apply

5 - 10 years

3 - 8 Lacs

Chennai, Pune, Noida

Work from Office

Source: Naukri

Candidates can share their resumes at dee[ali.rawat@rsystems.com

Data Engineer/Developer with 6+ years of relevant working experience in the following skills:
- Azure Data Factory
- Azure Databricks
- Microsoft Fabric
- Complex SQL query development
- DAX
- Data warehouse design and development
- Experience handling complex data flows, pipelines, and API call requests
- Excellent communication skills

Posted 2 months ago

Apply