
982 ADF Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Role Description

Job Summary: We are seeking an experienced ADF Developer to design, build, and maintain data integration solutions using Azure Data Factory, with exposure to Azure Databricks (ADB). The ideal candidate will have hands-on expertise in ETL pipelines, data engineering, and Azure cloud services to support enterprise data initiatives.

Key Responsibilities
- Design and develop scalable ETL pipelines using ADF.
- Integrate ADB for advanced data transformation tasks.
- Optimize and troubleshoot ADF pipelines and queries (SQL, Python, Scala).
- Implement robust data validation, error handling, and performance tuning.
- Collaborate with data architects, analysts, and DevOps teams.
- Maintain technical documentation and support ongoing solution improvements.

Required Qualifications
- Bachelor's/Master's in Computer Science or a related field.
- 2+ years of hands-on ADF experience.
- Strong skills in Python, SQL, and/or Scala.
- Familiarity with ADB and Azure cloud services.
- Solid knowledge of ETL, data warehousing, and performance optimization.

Preferred
- Microsoft Azure Data Engineer certification.
- Exposure to Spark, Hadoop, Git, Agile practices, and domain-specific projects (finance, healthcare, retail).
- Understanding of data governance and compliance.

Skills: ADF, ADB, DataStage
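Listings like the one above ask for "robust data validation, error handling, and performance tuning" in pipelines. Purely to illustrate the error-handling part, here is a minimal, library-free Python sketch of retrying a flaky pipeline activity with exponential backoff; the function and activity names are invented for the example and are not from any Azure SDK.

```python
import time

def run_with_retries(activity, max_attempts=3, base_delay=1.0):
    """Run a pipeline activity, retrying transient failures with
    exponential backoff. `activity` is any zero-argument callable."""
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical activity that fails twice, then succeeds.
calls = {"n": 0}
def flaky_copy():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source timeout")
    return "copied 1000 rows"

result = run_with_retries(flaky_copy, base_delay=0.01)
```

In a real ADF setup the retry policy usually lives on the activity definition itself; a wrapper like this is more typical in custom orchestration or notebook code.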

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Role Description

Job Summary: We are seeking a seasoned ADF Developer to design, implement, and optimize data integration solutions using Azure Data Factory (ADF) as the primary tool; experience with Azure Databricks (ADB) is a plus. The ideal candidate has strong ETL, data engineering, and cloud expertise within the Azure ecosystem.

Key Responsibilities
- Design and develop ETL pipelines using ADF; integrate ADB for complex transformations.
- Write optimized Python, SQL, or Scala code for large-scale data processing.
- Configure ADF pipelines, datasets, linked services, and triggers.
- Ensure high data quality through robust validation, testing, and error handling.
- Optimize pipeline and query performance; troubleshoot issues proactively.
- Collaborate with data architects, analysts, and DevOps teams.
- Maintain clear documentation of pipeline logic and data flows.
- Support users and ensure minimal disruption to business operations.

Required Skills
- 7+ years of hands-on ADF experience.
- Strong in Python, SQL, and/or Scala.
- Experience with ETL, data modeling, and Azure cloud tools.
- Familiarity with Azure Databricks.
- Excellent problem-solving and communication skills.

Preferred
- Microsoft Azure Data Engineer Associate certification.
- Experience with Spark, Hadoop, Git, Agile, and data governance.
- Domain exposure: finance, healthcare, or retail.

Skills: ADF, ADB, DataStage

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

About the Job

About Beyond Key: We are a Microsoft Gold Partner and a Great Place to Work-certified company. "Happy Team Members, Happy Clients" is a principle we hold dear. We are an international IT consulting and software services firm committed to providing cutting-edge services and products that satisfy our clients' global needs. Established in 2005, we have since grown to more than 350 talented software professionals. Our clients come from the United States, Canada, Europe, Australia, the Middle East, and India, and we create and design IT solutions for them. For more details, visit https://www.beyondkey.com/about.

Job Title: Senior Data Engineer (Power BI, ADF & MS Fabric)
Experience: 7+ years
Location: Indore / Pune (Hybrid/Onsite)
Job Type: Full-time
Open Positions: 1

Key Responsibilities
- Design, develop, and maintain interactive Power BI dashboards and reports with advanced DAX, Power Query, and custom visuals.
- Build and optimize end-to-end data solutions using Microsoft Fabric (OneLake, Lakehouse, Data Warehouse).
- Develop and automate ETL/ELT pipelines using Azure Data Factory (ADF) and Fabric Data Pipelines.
- Architect and manage modern data warehousing solutions (star/snowflake schema) using Fabric Warehouse, Azure Synapse, or SQL Server.
- Implement data modeling, performance tuning, and optimization for large-scale datasets.
- Collaborate with business teams to translate requirements into scalable Fabric-based analytics solutions.
- Ensure data governance, security, and compliance across BI platforms.
- Mentor junior team members on Fabric, Power BI, and cloud data best practices.

Required Skills & Qualifications
- 7+ years of hands-on experience in Power BI, SQL, data warehousing, and ETL/ELT.
- Strong expertise in Microsoft Fabric (Lakehouse, Warehouse, ETL workflows, Delta Lake).
- Proficiency in Azure Data Factory (ADF) for orchestration and data integration.
- Advanced SQL (query optimization, stored procedures, partitioning).
- Experience with data warehousing (dimensional modeling, SCD, fact/dimension tables).
- Knowledge of Power BI Premium/Fabric capacity, deployment pipelines, and DAX patterns.
- Familiarity with Databricks, PySpark, or Python (for advanced analytics) is a plus.
- Strong problem-solving and stakeholder management skills.

Preferred Qualifications
- Microsoft certifications (PL-300: Power BI, DP-600: Fabric Analytics Engineer).
- Experience with Azure DevOps (CI/CD for Fabric/Power BI deployments).
- Domain knowledge in BFSI, retail, or manufacturing.
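Several of these roles, including this one, list slowly changing dimensions (SCD) among the required data-warehousing skills. To make the core SCD Type 2 idea concrete (expire the current row and insert a new versioned one, preserving history), here is a small pure-Python sketch; the bookkeeping column names (`is_current`, `valid_from`, `valid_to`) are conventional but chosen for the example.

```python
def scd2_apply(dimension, updates, load_date):
    """Apply SCD Type 2 logic: when a tracked attribute changes,
    expire the current row and append a new one. Rows are plain dicts."""
    for key, new_attrs in updates.items():
        current = next((r for r in dimension
                        if r["key"] == key and r["is_current"]), None)
        if current and current["attrs"] == new_attrs:
            continue  # no attribute change: nothing to do
        if current:
            current["is_current"] = False
            current["valid_to"] = load_date  # expire the old version
        dimension.append({"key": key, "attrs": new_attrs,
                          "valid_from": load_date, "valid_to": None,
                          "is_current": True})
    return dimension

# Hypothetical customer dimension: the customer moves city.
dim = [{"key": "C1", "attrs": {"city": "Indore"},
        "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"C1": {"city": "Pune"}}, "2024-06-01")
```

In a warehouse this same logic is usually expressed as a `MERGE` statement or a mapping-tool transformation rather than row-by-row Python.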

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

JD for a Databricks Data Engineer

Key Responsibilities:
- Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark.
- Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing.
- Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks.
- Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows.
- Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing.
- Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage.
- Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools.
- Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management.
- Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption.
- Collaborate with data scientists, analysts, and business teams to deliver insights.

Required Skills & Experience:
- 5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake.
- Strong SQL, PySpark, and Python programming skills.
- Experience with Azure Data Factory (ADF), AWS Glue, or GCP Dataflow.
- Expertise in performance tuning, indexing, caching, and parallel processing.
- Hands-on experience with Lakehouse architecture and Databricks SQL.
- Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog).
- Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins).
- Familiarity with Airflow, Databricks Workflows, or other orchestration tools.
- Strong problem-solving skills with experience troubleshooting Spark jobs.

Nice to Have:
- Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks.
- Certifications in Databricks, Azure, AWS, or GCP.
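The Databricks role above centres on the medallion architecture (Bronze, Silver, Gold). In production this would be PySpark over Delta tables; purely to illustrate the layering idea, here is a tiny plain-Python sketch in which Bronze holds raw rows as ingested, Silver holds validated and typed rows, and Gold holds a business-level aggregate. All data and names are made up.

```python
# Bronze: raw records exactly as ingested (including a bad row).
bronze = [
    {"order_id": "1", "amount": "120.50", "country": "IN"},
    {"order_id": "2", "amount": "oops",   "country": "IN"},
    {"order_id": "3", "amount": "80.00",  "country": "US"},
]

def to_silver(rows):
    """Silver: validated, typed records. Unparseable rows are dropped
    here; in Databricks they would typically be quarantined instead."""
    silver = []
    for r in rows:
        try:
            silver.append({**r, "amount": float(r["amount"])})
        except ValueError:
            pass
    return silver

def to_gold(rows):
    """Gold: a business-level aggregate (revenue per country)."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
```

The point of the layering is that each stage is reproducible from the one below it, so a bug in Silver logic can be fixed and replayed without re-ingesting Bronze.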

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

About Gartner IT
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the Role
Data warehousing engineer with technical expertise, capable of collaborating with the team to create a Data Platform Strategy and implement the solution.

What You'll Do
- Participate in the design and implementation of the data warehousing solution.
- Participate in the end-to-end delivery of solutions, from gathering requirements through implementation, testing, and continuous improvement post-rollout, using Agile Scrum methodologies.

What You'll Need
- 2-4 years of experience in software programming and/or data warehousing in an Agile Scrum environment.

Must Have
- Strong experience in SQL, ADF, and Synapse/Databricks.
- ETL process design, including techniques for addressing slowly changing dimensions, differential fact-journaling (i.e., storage optimization for fact data), semi-additive measures and related concerns, and rolldown distributions.
- SQL query optimization.

Who You Are
- Bachelor's degree in computer science or information systems, or equivalent experience in software development.
- Effective time management skills and the ability to meet deadlines, delivering project work on time, within budget, and with high quality.
- Excellent communication skills interacting with technical and business audiences.
- Excellent organization, multitasking, and prioritization skills.
- Willingness and aptitude to embrace new technologies and ideas and master concepts rapidly.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this or other roles.

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive, working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 99949

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
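The Gartner listing names semi-additive measures, a dimensional-modeling concern worth unpacking: a measure like an account balance can be summed across accounts but not across time. This small illustrative Python sketch, with made-up data, shows why naive summation over-counts and how a period-end aggregation handles it.

```python
# Daily balances per account: a semi-additive fact. It may be
# summed across accounts, but not across dates.
balances = [
    {"date": "2024-06-01", "account": "A", "balance": 100},
    {"date": "2024-06-01", "account": "B", "balance": 50},
    {"date": "2024-06-02", "account": "A", "balance": 120},
    {"date": "2024-06-02", "account": "B", "balance": 70},
]

def period_end_total(rows):
    """Sum across accounts on the *latest* date only, rather than
    summing every row (which would double-count across days)."""
    last_date = max(r["date"] for r in rows)
    return sum(r["balance"] for r in rows if r["date"] == last_date)

naive = sum(r["balance"] for r in balances)   # over-counts across days
correct = period_end_total(balances)          # closing-balance total
```

Other common semi-additive strategies are opening balance and period average; which one applies is a business decision, not a technical one.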

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

Job Details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237233

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Our global team creates innovative, integrated solutions.

Software Developer - Data Solutions (ETL)

Johnson Controls is seeking an experienced ETL Developer responsible for designing, implementing, and managing ETL processes. The successful candidate will work closely with data architects, business analysts, and stakeholders to ensure data is extracted, transformed, and loaded accurately and efficiently for reporting and analytics purposes.

Key Responsibilities
- Design, develop, and implement ETL processes to extract data from various sources
- Transform data to meet business requirements and load it into data warehouses or databases
- Optimize ETL processes for performance and reliability
- Collaborate with data architects and analysts to define data requirements and ensure data quality
- Monitor ETL jobs and resolve issues as they arise
- Create and maintain documentation of ETL processes and workflows
- Participate in data modeling and database design

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 3 to 5 years of experience as an ETL Developer or similar role
- Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
- Proficient in SQL for data manipulation and querying
- Experience with data warehousing concepts and methodologies
- Knowledge of scripting languages (e.g., Python, Shell) is a plus
- Excellent problem-solving skills and attention to detail
- Strong communication skills to collaborate with technical and non-technical stakeholders
- Flexibility to work across a delivery landscape that includes, but is not limited to, Agile application development, support, and deployment
- Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob Storage, Azure Storage Explorer, Snowflake, and Snowpark

What We Offer
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

Job Details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237243

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Join us and be part of a team that prioritizes innovation and customer satisfaction.

What You Will Do
- Design, develop, and implement ETL processes to extract data from various sources
- Transform data to meet business requirements and load it into data warehouses or databases
- Optimize ETL processes for performance and reliability
- Collaborate with data architects and analysts to define data requirements and ensure data quality
- Monitor ETL jobs and resolve issues as they arise
- Create and maintain documentation of ETL processes and workflows
- Participate in data modeling and database design, and provide appropriate solutions to requirements

What We Look For
Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 3 to 5 years of experience as an ETL Developer or similar role
- Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
- Proficient in SQL for data manipulation and querying
- Experience with data warehousing concepts and methodologies
- Knowledge of scripting languages (e.g., Python, Shell) is a plus
- Excellent problem-solving skills and attention to detail
- Strong communication skills to collaborate with technical and non-technical stakeholders
- Flexibility to work across a delivery landscape that includes, but is not limited to, Agile application development, support, and deployment
- Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob Storage, Azure Storage Explorer, Snowflake, and Snowpark

What We Offer
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

Skills: Oracle CC&B, Oracle Cloud, Java, PL/SQL, Oracle ADF, Oracle JET

Greetings from Colan Infotech!

Job Title: Oracle CC&B Developer & Administrator (OCI)
Location: Remote
Department: IT / Enterprise Applications

Job Summary
We are looking for a highly skilled Oracle Customer Care & Billing (CC&B) Developer & Administrator with experience managing CC&B on Oracle Cloud Infrastructure (OCI). This role is critical to supporting and enhancing our utility billing platform through custom development, system upgrades, issue resolution, and infrastructure management. The ideal candidate is technically strong, detail-oriented, and experienced in both back-end and front-end CC&B development.

Key Responsibilities

Development & Customization
- Design and develop enhancements and custom modules for Oracle CC&B using Java, PL/SQL, Oracle ADF, and Oracle JET.
- Implement business rules, workflows, batch processes, and UI changes based on stakeholder requirements.
- Build RESTful APIs and integrations with internal and third-party systems (e.g., MDM, GIS, payment gateways).

Upgrades & Maintenance
- Lead full-lifecycle CC&B upgrades, including planning, testing, migration, and production deployment.
- Apply and test Oracle patches and interim fixes; resolve any post-patch issues.

OCI Administration
- Manage CC&B environments hosted on Oracle Cloud Infrastructure (OCI), including Compute, Autonomous Database, Load Balancers, and Object Storage.
- Configure and monitor system performance using Oracle Enterprise Manager (OEM).
- Implement backup, recovery, and high-availability strategies aligned with security best practices.

Support & Issue Resolution
- Provide daily operational support and issue resolution for the CC&B application and infrastructure.
- Perform root cause analysis and deliver long-term fixes for recurring issues.
- Monitor, tune, and optimize system performance (JVM, SQL, WebLogic).

Documentation & Collaboration
- Maintain detailed documentation, including technical specs, runbooks, and support procedures.
- Collaborate with QA, infrastructure, and business teams to ensure smooth operations and releases.
- Use Bitbucket for version control and code collaboration.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience with Oracle CC&B development and administration.
- Proven experience with CC&B upgrades, patching, and environment management.
- Strong development skills in Java (8+), PL/SQL, Oracle ADF, and Oracle JET.
- Solid experience with OCI components, including Compute, Autonomous Database, IAM, and networking.
- Proficiency with Oracle Enterprise Manager (OEM) for monitoring and diagnostics.
- Experience using Bitbucket or similar version control platforms.
- Strong problem-solving and communication skills.
- Ability to work both independently and as part of a cross-functional team.

Preferred Qualifications
- Experience with Oracle SOA Suite or Oracle Integration Cloud.
- Knowledge of utility billing processes and customer service workflows.
- Experience working in agile or hybrid project environments.

Interested candidates, send your updated resume to kumudha.r@colanonline.com

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Role: Azure Data Engineer
Experience: Minimum 3-5 years
Location: Spaze ITech Park, Sector 49, Gurugram
Working Days: Monday to Friday (9:00 AM - 6:00 PM)
Joining: < 15 days

About Us
Panamoure is a UK-based group with an offshore centre in Gurgaon, India. We are known as the ultimate business and technology change partner for our clients, including PE groups and ambitious mid-market businesses. Panamoure is a fast-paced and dynamic management consultancy delivering business and technology change services to the UK's fastest-growing companies. Our ability to deliver exceptional quality to our clients has seen us grow rapidly over the last 36 months, and we have ambitious plans to scale substantially further. As part of this growth we are looking to expand both our UK and India teams with bright, ambitious, and talented individuals who want to learn and grow with the business.

Primary Skills
The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities
- Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries.
- Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration.
- Build and optimize ETL/ELT processes to transfer data between Azure Data Lake, SQL Server, and other systems.
- Design and implement Microsoft Fabric Lakehouses for structured and unstructured data storage.
- Build scalable ETL/ELT pipelines to move and transform data across Azure Data Lake, SQL Server, and external data sources.
- Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases.
- Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage.
- Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, and Fabric monitoring capabilities.
- Ensure data integrity, consistency, and security, following data governance frameworks such as Azure Purview.
- Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment.
- Utilize Azure Monitor, Log Analytics, and Application Insights for pipeline monitoring and performance optimization.
- Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Requirements
- 4+ years of experience in data engineering with strong expertise in SQL development.
- Proficiency in SQL Server, T-SQL, and query optimization techniques.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database.
- Solid understanding of ETL/ELT processes, data integration patterns, and data transformation.
- Practical experience with Microsoft Fabric components: Fabric Dataflows for self-service data preparation; Fabric Lakehouses for unified data storage; Fabric Synapse Real-Time Analytics for streaming data insights; Fabric Direct Lake mode with Power BI for optimized performance.
- Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management.
- Proficiency in Python or Scala for data transformation tasks.
- Experience with Azure DevOps, Git, and CI/CD pipeline automation.
- Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC.
- Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates.
- Understanding of data security protocols such as data encryption and network security groups (NSGs).
- Familiarity with streaming services like Azure Event Hub or Kafka is a plus.
- Excellent problem-solving, communication, and team collaboration skills.
- Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer
- Opportunity to work with modern data architectures and Microsoft Fabric innovations.
- Competitive salary and benefits package, tailored to experience and qualifications.
- Opportunities for professional growth and development in a supportive and collaborative environment.
- A culture that values diversity, creativity, and a commitment to excellence.

Benefits and Perks
- Provident Fund
- Health Insurance
- Flexible Timing
- Office Lunch Provided

How to Apply
Interested candidates should submit their resume and a cover letter detailing their data engineering experience, SQL expertise, and familiarity with Microsoft Fabric to hr@panamoure.com. We look forward to adding a skilled Azure Data Engineer to our team! (ref:hirist.tech)
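This role, like several above, asks for star-schema data modeling. To make the idea concrete, here is a toy pure-Python version of the canonical star-schema query: join a fact table to a dimension table on a surrogate key, then group by a dimension attribute. All tables and values are invented for illustration.

```python
# Star schema in miniature: the fact table holds surrogate keys and
# measures; descriptive attributes live in dimension tables.
dim_product = {1: {"name": "Sensor", "category": "IoT"},
               2: {"name": "Valve",  "category": "HVAC"}}
fact_sales = [{"product_key": 1, "qty": 3, "revenue": 300.0},
              {"product_key": 2, "qty": 1, "revenue": 150.0},
              {"product_key": 1, "qty": 2, "revenue": 200.0}]

def revenue_by_category(facts, products):
    """Join facts to the product dimension on the surrogate key,
    then aggregate a measure by a dimension attribute."""
    out = {}
    for f in facts:
        cat = products[f["product_key"]]["category"]
        out[cat] = out.get(cat, 0.0) + f["revenue"]
    return out

totals = revenue_by_category(fact_sales, dim_product)
```

In T-SQL this is a `JOIN` plus `GROUP BY`; the design choice that makes it fast is keeping the fact table narrow (keys and measures only) so the join key is cheap to scan.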

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

India

Remote

Source: LinkedIn

Hi,

Please go through the requirements below, and if interested, forward your resume along with your contact information to raja@covetitinc.com.

Role: Data Engineer
Location: Remote

Job Purpose
This position will help design, develop, and provide operational support for data integration/ETL projects and activities. He or she will also be required to guide and mentor other data engineers; coordinate, assign, and oversee tasks related to ETL projects; and work with functional analysts, end users, and other BI team members to design effective ETL solutions and data integration pipelines.

Essential Functions and Responsibilities
The following are the essential functions of this position. This position may be responsible for performing additional duties and tasks as needed and assigned.
- Technical design, development, testing, and documentation of data warehouse/ETL projects.
- Perform data profiling and logical/physical data modelling to build new ETL designs and solutions.
- Develop, implement, and deploy ETL solutions to update the data warehouse and data marts.
- Maintain quality control, document technical specs, and unit test to ensure accuracy and quality of BI data.
- Implement, stabilize, and establish a DevOps process for version control and deployment from non-prod to prod environments.
- Troubleshoot, debug, and diagnose ETL issues.
- Provide production support and work with other IT team members and end users to resolve data refresh issues; provide off-hours operational support as needed.
- Performance tuning and enhancement of SQL and ETL processes, and preparation of related technical documentation.
- Work with the offshore team to coordinate development work and operational support.
- Keep abreast of the latest ETL technologies and plan their effective use.
- Be a key player in planning migration of our EDW system to a modern global data warehouse architecture.
- Assess and implement new EDW/cloud technologies to help evolve the EDW architecture for efficiency and performance.
- Communicate clearly and professionally with users, peers, and all levels of management, in both written and verbal form.
- Lead ETL tasks and activities related to BI projects; assign, coordinate, and follow up on activities to meet ETL project timelines; follow through and ensure proper closure of service request issues.
- Help with AI/ML projects as assigned.
- Perform code reviews on ETL/report changes where appropriate.
- Coordinate with the DBA team on migration, configuration, and tuning of ETL code.
- Act as a mentor for other data engineers on the BI team.
- Adhere to the processes and work policies defined by management.
- Perform other duties as needed.

Minimum Qualifications
The requirements listed below are representative of the education, knowledge, skill, and/or ability required for this position.

Education/Certifications:
- Minimum of 8 years of related experience with a Bachelor's degree in Computer Science, MIS, Data Science, or a related field; or 6 years and a Master's degree.

Experience, Skills, Knowledge and/or Abilities:
- Understanding of ERP business processes (Order to Cash, Procure to Pay, Record to Report, etc.), data warehouse and BI concepts, and the ability to apply educational and practical experience to improve business intelligence applications and provide simplified, standardized solutions that achieve business objectives.
- Expert knowledge of data warehouse architecture; well versed in modern data warehouse concepts, EDW, and data lake/cloud architecture.
- Expertise in dimensional modeling and star schema design, including best practices for use of indexes, partitioning, and data loading.
- Advanced experience in SQL, writing stored procedures, and tuning SQL, preferably using Oracle PL/SQL.
- Strong experience with data integration using ADF (Azure Data Factory).
- Well versed in database administration tasks and working with DBAs to monitor and resolve SQL/ETL issues and tune performance.
- Experience with the DevOps process in ADF, preferably using GitHub; experience with other version control tools is helpful.
- Experience troubleshooting data warehouse refresh issues and validating BI report data against source systems.
- Excellent communication skills.
- Ability to organize and handle multiple tasks simultaneously.
- Ability to mentor and coordinate activities for other data engineers as needed.

Preferred Qualifications
The education, knowledge, skills, and/or abilities listed below are preferred qualifications in addition to the minimum qualifications stated above.
- Experience working with Oracle EBS or any major ERP system, such as SAP.
- Experience with AI/ML; experience in R, Python, or PySpark is a plus.
- Experience with cloud EDW technologies such as Databricks, Snowflake, or Synapse.
- Experience with Microsoft Fabric, Data Lakehouse concepts, and related reporting capabilities.

Physical Requirements / Adverse Working Conditions
The physical requirements listed in this section include, but are not limited to, the motor/physical abilities, skills, and/or demands required of the position in order to successfully undertake its essential duties and responsibilities. In accordance with the Americans with Disabilities Act (ADA), reasonable accommodations may be made to allow qualified individuals with a disability to perform the essential functions and responsibilities of the position. No additional physical requirements or essential functions apply to this position.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are seeking an experienced and strategic Data Architect to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, Unity Catalog, and Spark, while aligning with best practices in data governance, pipeline automation, and performance optimization.

Key Responsibilities:
• Design and develop scalable data pipelines using Databricks and the Medallion architecture (Bronze, Silver, and Gold layers).
• Architect and implement data governance frameworks using Unity Catalog and related tools.
• Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment.
• Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes.
• Optimize queries and data structures for performance and cost efficiency.
• Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control.
• Collaborate with cross-functional teams to define data strategies and drive data quality initiatives.
• Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering.
• Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
• Maintain comprehensive documentation of architecture, processes, and workflows.

Requirements:
• Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
• Proven experience as a Data Architect or Senior Data Engineer.
• Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL.
• Hands-on experience with data governance, security frameworks, and catalog management.
• Proficiency in cloud platforms (preferably Azure).
• Experience with CI/CD tools and version control systems such as GitHub.
• Strong communication and collaboration skills.
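The Medallion pattern named in this listing (Bronze for raw ingested data, Silver for cleansed/typed data, Gold for business-level aggregates) can be illustrated without a Spark cluster. The sketch below uses plain Python lists and dicts in place of PySpark DataFrames and Delta tables; the field names and cleansing rules are invented for illustration.

```python
# Toy Medallion flow: Bronze holds raw records as ingested, Silver holds
# cleansed and typed records, Gold holds business-level aggregates.
# In a real Databricks pipeline these would be Delta tables transformed
# with PySpark; plain lists/dicts stand in here.

bronze = [
    {"id": "1", "region": " EU ", "revenue": "100.5"},
    {"id": "2", "region": "EU", "revenue": "bad"},   # malformed row
    {"id": "3", "region": "US", "revenue": "200.0"},
]

def to_silver(rows):
    """Cleanse: trim strings, cast types, drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]),
                        "region": r["region"].strip(),
                        "revenue": float(r["revenue"])})
        except ValueError:
            continue  # a real pipeline would quarantine, not silently drop
    return out

def to_gold(rows):
    """Aggregate: total revenue per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["revenue"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EU': 100.5, 'US': 200.0}
```

The key design point the layers buy you is replayability: because Bronze keeps the raw input untouched, the Silver and Gold transformations can be fixed and re-run without re-extracting from the source.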

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office


#JobOpening Data Engineer (Contract | 6 Months)
Location: Hyderabad | Chennai | Remote Flexibility Possible
Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities
Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
Monitor and support production ETL jobs
Develop and maintain data lineage documentation for all systems
Design data mapping and documentation to aid QA/UAT testing
Evaluate and recommend modern data integration tools
Optimize shared data workflows and batch schedules
Collaborate with Data Quality Analysts to ensure the accuracy and integrity of data flows
Participate in performance tuning and improvement recommendations
Support BI/MDM initiatives, including Data Vault and Data Lakes

#RequiredSkills
7+ years of experience in data engineering roles
Strong command of SQL, with 5+ years of hands-on development
Deep experience with Snowflake, Azure Data Factory, and dbt
Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
Bachelor's degree in CS, Engineering, Math, or a related field
Experience in the healthcare domain (working with PHI/PII data)
Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
Excellent communication and documentation skills
Experience with BI tools such as Power BI, Cognos, etc.
Organized self-starter with strong time-management and critical-thinking abilities

#NiceToHave
Experience with Data Lakes and Data Vaults
QA and UAT alignment with clear development documentation
Multi-cloud experience (especially Azure and AWS)

#ContractDetails
Role: Data Engineer
Contract Duration: 6 Months
Location Options: Hyderabad / Chennai (remote flexibility available)
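Production ETL jobs like the ones this listing describes usually load incrementally rather than via full refresh. A common approach, used by tools such as Fivetran and easily replicated in ADF or dbt, is a high-watermark on a modification timestamp. The plain-Python sketch below illustrates the idea; the record shapes and timestamps are hypothetical.

```python
# High-watermark incremental extraction: pull only rows modified since the
# last successful run, then advance the watermark. ISO-8601 timestamp
# strings compare correctly as plain strings, so no parsing is needed here.

def extract_incremental(rows, watermark):
    """Return (new_rows, new_watermark) for rows modified after watermark."""
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source_table = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "updated_at": "2024-01-02T09:30:00"},
    {"id": 3, "updated_at": "2024-01-03T12:00:00"},
]

batch, wm = extract_incremental(source_table, "2024-01-01T00:00:00")
print(len(batch), wm)  # 2 rows; watermark advances to the newest timestamp
```

In a real pipeline the watermark would be persisted (e.g., in a control table) only after the batch commits, so a failed run naturally re-extracts the same window on retry.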

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
Analyse the chosen technologies against the implied target state and leverage good operational knowledge to identify technical and business gaps.
Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances, and others to drive an integrated solution development and activation plan.
Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems and presenting solutions and options in a simplified manner for clients and the business.

Microsoft Sustainability Manager configuration and customization:
Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
Develop automation routines and workflows for data ingestion, processing, and transformation.
Integrate Sustainability Manager with other relevant data platforms and tools.
Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
Collaborate with stakeholders to identify data and reporting needs.
Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
Ensure the data model aligns with relevant ESG frameworks and reporting standards.
Create clear documentation and maintain data lineage for transparency and traceability.
Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicator) modelling and analysis:
Define and develop relevant KPIs for tracking progress towards ESG goals.
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability and relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of Power Platform (Core), Power Apps (Canvas and Model-Driven), and Power Automate.
At least 6 years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Power Portals/Power Pages) and Dynamics CRM/365.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
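KPI modelling for ESG reporting, as this listing describes it, typically reduces a stream of activity records to a small set of tracked metrics such as total CO2e and emissions intensity. A minimal sketch in plain Python follows; the emission factors, activity types, and quantities are invented for illustration and are not drawn from any ESG framework or from Microsoft Sustainability Manager.

```python
# Toy ESG KPIs: total CO2e and emissions intensity (CO2e per unit of
# production output) from activity records. Factor values are illustrative.

EMISSION_FACTORS = {  # kg CO2e per unit of activity (hypothetical values)
    "electricity_kwh": 0.4,
    "diesel_litre": 2.7,
}

def total_co2e(activities):
    """Sum kg CO2e across activity records using the factor table."""
    return sum(EMISSION_FACTORS[a["type"]] * a["quantity"] for a in activities)

def intensity(activities, output_units):
    """KPI: kg CO2e per unit of production output."""
    return total_co2e(activities) / output_units

activities = [
    {"type": "electricity_kwh", "quantity": 1000},  # 400 kg CO2e
    {"type": "diesel_litre", "quantity": 100},      # 270 kg CO2e
]

print(total_co2e(activities))       # 670.0
print(intensity(activities, 5000))  # 0.134 kg CO2e per unit
```

In a production data model the factor table would itself be versioned data with lineage, since restating a factor restates every KPI computed from it.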

Posted 3 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate, reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
Understand the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
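A carbon-footprint-tracking Model Driven App of the kind this listing describes ultimately stores validated activity records in Dataverse, and the validation rules themselves can be sketched independently of the platform. The plain-Python sketch below illustrates typical record-level checks; the field names mimic, but are not, actual Dataverse column names, and the allowed units are hypothetical.

```python
# Platform-independent sketch of record validation for a carbon-footprint
# tracking app: required fields present, positive quantities, known units.
# Field names and the unit list are hypothetical, not Dataverse schema.

ALLOWED_UNITS = {"kWh", "litre", "kg"}
REQUIRED = ("activity_name", "quantity", "unit")

def validate_activity(record):
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    for field in REQUIRED:
        if field not in record or record[field] in (None, ""):
            errors.append(f"missing field: {field}")
    if isinstance(record.get("quantity"), (int, float)):
        if record["quantity"] <= 0:
            errors.append("quantity must be positive")
    if record.get("unit") and record["unit"] not in ALLOWED_UNITS:
        errors.append(f"unknown unit: {record['unit']}")
    return errors

good = {"activity_name": "Fleet fuel", "quantity": 120.5, "unit": "litre"}
bad = {"activity_name": "Grid power", "quantity": -5, "unit": "MWh"}

print(validate_activity(good))  # []
print(validate_activity(bad))   # negative quantity and unknown unit
```

In the actual app these checks would live in JavaScript form validation or server-side plugins, but keeping the rule set expressible in one place like this makes it easy to review with sustainability stakeholders.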

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
Analyse the chosen technologies against the implied target state and leverage good operational knowledge to identify technical and business gaps.
Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances, and others to drive an integrated solution development and activation plan.
Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems and presenting solutions and options in a simplified manner for clients and the business.

Microsoft Sustainability Manager configuration and customization:
Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
Develop automation routines and workflows for data ingestion, processing, and transformation.
Integrate Sustainability Manager with other relevant data platforms and tools.
Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
Collaborate with stakeholders to identify data and reporting needs.
Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
Ensure the data model aligns with relevant ESG frameworks and reporting standards.
Create clear documentation and maintain data lineage for transparency and traceability.
Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicator) modelling and analysis:
Define and develop relevant KPIs for tracking progress towards ESG goals.
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability and relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of Power Platform (Core), Power Apps (Canvas and Model-Driven), and Power Automate.
At least 6 years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Power Portals/Power Pages) and Dynamics CRM/365.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate, reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
Understand the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate, reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
Understand the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Manager - MSM (Microsoft Sustainability Manager) Architect As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients. Your Key Responsibilities Oversees the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence and risks are identified, managed, and mitigated. Analyse the chosen technologies against the implied target state and leverages good operational knowledge to identify technical and business gaps. Provides innovative and practical designs for the design and integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities. Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan. Create sales and delivery collateral, online knowledge communities and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts. Acts as an intermediary between the business / client community and the technical community, working with the business to understand and solve complex problems, presenting solutions and options in a simplified manner for clients / business. 
Microsoft Sustainability Manager configuration and customization:
- Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
- Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
- Develop automation routines and workflows for data ingestion, processing, and transformation.
- Integrate Sustainability Manager with other relevant data platforms and tools.
- Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
- Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
- Collaborate with stakeholders to identify data and reporting needs.
- Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
- Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
- Ensure the data model aligns with relevant ESG frameworks and reporting standards.
- Create clear documentation and maintain data lineage for transparency and traceability.
- Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicators) modelling and analysis:
- Define and develop relevant KPIs for tracking progress towards ESG goals.
- Perform data analysis to identify trends, patterns, and insights related to ESG performance.
- Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
- 3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
- Subject matter expertise in sustainability and relevant experience (across any industry or competency) preferred.
- Experience managing large, complex change management programs with multiple global stakeholders (required).
- Strong knowledge of Power Platform (Core), Power Apps (Canvas and Model-driven), and Power Automate.
- At least 6+ years of relevant experience on Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Power Portals/Power Pages) and Dynamics CRM/365.
- Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
- Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
- Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
- Prior experience in go-to-market efforts.
- Strong understanding of data modelling concepts and methodologies.
- Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

14 - 24 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


4 years of hands-on experience in .NET, C#, MVC, SQL, and Web API development. Familiarity with Function Apps, Cosmos DB, Durable Function Apps, Event Grid, Azure Data Factory, Logic Apps, Service Bus, and Storage Accounts is essential. CTC up to 24 LPA.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.

- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities

- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications

- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
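The observability and self-healing themes in this role often start with simple freshness and volume checks on each pipeline run. The sketch below is a hypothetical illustration in plain Python, not part of the role description; the function name, thresholds, and alert wording are all assumptions:

```python
from datetime import datetime, timedelta, timezone

def check_pipeline_health(last_success: datetime, row_count: int,
                          max_staleness_hours: int = 6, min_rows: int = 1):
    """Return alert messages for a pipeline run; an empty list means healthy."""
    alerts = []
    now = datetime.now(timezone.utc)
    # Freshness check: has the pipeline succeeded within its SLA window?
    if now - last_success > timedelta(hours=max_staleness_hours):
        alerts.append("freshness SLA breached: last successful run is too old")
    # Volume check: a near-empty load usually signals an upstream failure.
    if row_count < min_rows:
        alerts.append("volume anomaly: row count below expected minimum")
    return alerts
```

In practice the returned alerts would be routed to a monitoring tool; the point is that observability rules are ordinary, testable code.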

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Opening: Senior Data Engineer (Remote, 6-Month Contract)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain a secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills
- Experience: 6+ years in Data Engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, containerization (Docker), clean coding practices

Good-to-Have Skills
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
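The "clean, organized, reusable" pipeline code this posting asks for usually comes down to composing small, single-purpose transforms. Here is a framework-agnostic sketch in plain Python; the function names and row shape are illustrative assumptions, and in Databricks the same idea maps onto chained DataFrame transformations:

```python
from functools import reduce
from typing import Callable, Dict, Iterable

Row = Dict[str, object]
Transform = Callable[[Iterable[Row]], Iterable[Row]]

def drop_null(key: str) -> Transform:
    """Filter out rows where `key` is missing or None."""
    return lambda rows: (r for r in rows if r.get(key) is not None)

def rename(old: str, new: str) -> Transform:
    """Rename a column, leaving other fields untouched."""
    def _t(rows: Iterable[Row]) -> Iterable[Row]:
        for r in rows:
            r = dict(r)
            r[new] = r.pop(old)
            yield r
    return _t

def pipeline(*steps: Transform) -> Transform:
    """Compose transforms left to right into one reusable step."""
    return lambda rows: reduce(lambda acc, step: step(acc), steps, rows)

# A hypothetical cleaning step assembled from the pieces above.
clean_sales = pipeline(drop_null("id"), rename("amt", "amount"))
```

Each step is independently testable, which is what makes a pipeline maintainable in the Agile, CI/CD setting the role describes.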

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

The Data Science team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Machine Learning Services and Pipelines.

PepsiCo Data Analytics & AI Overview: With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization that Data Scientists and ML Engineers report to in the broader D+A organization. DS will also lead, facilitate and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI Products, and support "pre-engagement" activities as requested and validated by the prioritization framework of DA&AI.

Data Scientist: Hyderabad and Gurugram

You will be part of a collaborative interdisciplinary team around data, where you will be responsible for the continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will give you the correct visibility into, and understanding of, the criticality of your developments.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope
- Active contribution to code and development in projects and services
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption
- Partner with ML engineers working on industrialization
- Communicate with business stakeholders in the process of service design, training and knowledge transfer
- Support large-scale experimentation and build data-driven models
- Refine requirements into modelling problems
- Influence product teams through data-based recommendations
- Research state-of-the-art methodologies
- Create documentation for learnings and knowledge transfer
- Create reusable packages or libraries
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time)
- Implement the end-to-end ML lifecycle with Azure Machine Learning and Azure Pipelines
- Automate ML model deployments

Qualifications
- BE/B.Tech in Computer Science, Maths, or related technical fields.
- Overall 5+ years of experience working as a Data Scientist.
- 4+ years' experience building solutions in the commercial or supply chain space.
- 4+ years working in a team to deliver production-level analytic solutions.
- Fluent in git (version control); understanding of Jenkins and Docker is a plus.
- Fluent in SQL syntax.
- 4+ years' experience with statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
- 4+ years' experience developing business-problem-related statistical/ML models with industry tools, with a primary focus on Python or PySpark development.

Skills, Abilities, Knowledge
- Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models. Knowledge of time series/demand forecast models is a plus.
- Programming skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics 'product' creation.
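As a concrete instance of the regression skills listed above, ordinary least squares for a single feature has a closed form: slope = sum((x - mean_x) * (y - mean_y)) / sum((x - mean_x)^2), intercept = mean_y - slope * mean_x. A dependency-free sketch follows; in real work one would reach for statsmodels or scikit-learn, so this only shows the arithmetic:

```python
def fit_simple_ols(xs, ys):
    """Fit y = slope * x + intercept by ordinary least squares."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Unnormalized covariance of x and y over unnormalized variance of x.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x
```

On points lying exactly on y = 2x + 1, the fit recovers slope 2 and intercept 1.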

Posted 3 weeks ago

Apply

12.0 - 20.0 years

22 - 37 Lacs

Bengaluru

Hybrid


- 12+ years of experience in Data Architecture
- Strong in Azure Data Services & Databricks, including Delta Lake & Unity Catalog
- Experience in Azure Synapse, Purview, ADF, DBT, Apache Spark, DWH, Data Lakes, NoSQL, OLTP
- NP: Immediate
- Contact: sachin@assertivebs.com

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components.

The following technology skills are required:
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience with ADF and Dataflow
- Experience with big data tools like Delta Lake and Azure Databricks
- Experience with Synapse
- Skills in designing an Azure data solution

Assemble large, complex data sets that meet functional and non-functional business requirements.
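The "advanced working SQL knowledge" asked for above is portable across engines. A minimal, self-contained illustration using Python's built-in sqlite3 module (the table and column names are invented for the example; in the Azure stack the same query shape would run against Synapse or Databricks SQL):

```python
import sqlite3

# In-memory database so the example runs anywhere with no setup.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 10.0), ("south", 5.0), ("north", 7.5)],
)
# Aggregate per region, largest total first: the shape of most ETL summaries.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('south', 15.0), ('north', 7.5)]
```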

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Location Name: Pune Corporate Office - HO

Job Purpose
- Handle development and support in PostgreSQL/Oracle/SQL database technology.
- Interact with cross-functional business teams to understand business needs and prioritize requirements.

Duties and Responsibilities
- Develop end-to-end functionality/modules, write complex code, perform UAT and production deployment, and manage support.
- Manage a team of 5-6 developers.
- Define project milestones and drive partner/development teams to ensure on-time delivery.
- Own the delivery schedule, change process management, project monitoring, and status reporting.
- Work with internal IT teams to ensure delivery of the agreed solutions.
- Test new builds for all scenarios and production outcomes.

Key Decisions / Dimensions
- Decisions on solutions through technology and innovation
- Tackling production issues

Major Challenges
- Managing delivery and support as an added responsibility
- Managing support teams
- Fulfilling the entire requirement within restricted timelines

Required Qualifications and Experience
- Qualifications: minimum qualification required is graduation; good negotiation and communication skills.
- Work experience: relevant work experience of 8 to 12 years.

Skills Keywords
- Oracle SQL and PL/SQL: procedures, packages, cursors, triggers, functions, complex SQL queries and PL/SQL code, partitioning techniques, data loading mechanisms, indexes, and other knowledge and experience in database design
- PostgreSQL 12.0, MSSQL 2019, Oracle 11g, Oracle 12c
- Oracle SQL Developer or PL/SQL Developer
- ADF 2.0
- Knowledge of GitHub and DevOps using Azure Pipelines
- Hands-on experience in query tuning and other optimization knowledge is an added advantage

Posted 3 weeks ago

Apply

Exploring ADF Jobs in India

The job market for ADF professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. In these listings, "ADF" most often refers to Azure Data Factory, Microsoft's cloud data-integration service, though it can also mean Oracle's Application Development Framework for building enterprise Java applications; companies across various industries are actively looking for skilled professionals in both areas to join their teams.

Top Hiring Locations in India

Here are 5 major cities in India where there is a high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai

Average Salary Range

The estimated salary range for ADF professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.

Related Skills

In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.

Interview Questions

Here are sample interview questions for ADF roles, categorized by difficulty level:

Basic:
- What is ADF and its key features?
- What is the difference between ADF Faces and ADF Task Flows?

Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?

Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?

Closing Remark

As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies