
1432 ADF Jobs - Page 48

Set up a job alert
JobPe aggregates job listings for easy access; you apply directly on the original job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.

- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities

- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications

- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
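The observability and monitoring duties above are typically implemented against the ADF management SDK. A minimal sketch, assuming the azure-identity and azure-mgmt-datafactory Python packages; the subscription, resource group, and factory names are placeholders, not from the posting:

```python
# Minimal sketch: list ADF pipeline runs that failed in the last 24 hours.
# Subscription, resource group, and factory names are hypothetical.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

runs = client.pipeline_runs.query_by_factory("<resource-group>", "<factory-name>", filters)
for run in runs.value:
    # Each PipelineRun carries the pipeline name, run id, and failure message.
    print(run.pipeline_name, run.run_id, run.message)
```

A loop like this is the usual seed of the "proactive issue identification" the posting mentions: feed the failed runs into an alerting channel or an automated re-run.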

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience: 5+ years
Notice Period: Immediate joiners
Work Timings: 1 PM – 10 PM
Location: Gurgaon
Work Mode: Hybrid

Strong experience in SQL development, along with experience in AWS cloud and good experience in ADF.

Role Responsibilities
- Design, develop, and implement database solutions using SQL Server and Azure Data Factory (ADF).
- Create and optimize complex SQL queries for efficient and reliable data retrieval.
- Build and manage data pipelines using ADF for data ingestion and transformation.
- Collaborate with stakeholders to gather requirements and understand data needs.
- Perform database maintenance tasks, including backups and recovery.
- Analyze and enhance SQL performance to reduce execution time.
- Work with data analysts to help visualize data and support reporting needs.
- Conduct code reviews and provide constructive feedback to peers.
- Document development processes and ensure adherence to best practices.
- Support system testing and troubleshoot issues as they arise.
- Participate in team meetings to discuss project updates and challenges.
- Ensure data security and compliance with relevant regulations.
- Continuously learn and apply the latest industry trends in database technologies.
- Assist in training junior developers and onboarding new team members.
- Contribute to agile project management by updating task progress in tracking tools.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience as a SQL Developer.
- Proficiency in SQL Server and ADF.
- Hands-on experience with ETL tools and methodologies.
- Strong understanding of database architectures and design.
- Familiarity with cloud technologies, especially Azure.
- Experience with version control systems like Git.
- Solid analytical and problem-solving skills.
- Ability to work effectively in a collaborative team environment.
- Excellent verbal and written communication skills.
- Knowledge of data warehousing concepts is a plus.
- Certifications in SQL or Azure Data Services are an advantage.
- Capability to handle multiple tasks and prioritize effectively.
- Detail-oriented with a focus on quality deliverables.
- Commitment to continuous learning and professional development.

Skills: agile methodologies, performance tuning, data analysis, Azure Data Factory (ADF), cloud technologies, data warehousing, Git, problem solving, SQL Server, database management, Azure, SQL, team collaboration, version control systems, ETL tools
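On the SQL development side, a minimal sketch of the parameterized-query pattern that the posting's optimization duties imply, assuming pyodbc; the connection string, table, and columns are hypothetical:

```python
# Minimal sketch: parameterized SQL Server query via pyodbc.
# Connection string, table, and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;"
    "DATABASE=<db>;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameter markers (?) let SQL Server reuse the cached execution plan
# and keep the query safe from injection.
cursor.execute(
    "SELECT order_id, order_total FROM dbo.Orders "
    "WHERE order_date >= ? AND region = ?",
    ("2024-01-01", "APAC"),
)
for row in cursor.fetchall():
    print(row.order_id, row.order_total)

cursor.close()
conn.close()
```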

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Primary skills: Big Data (Big Table) | Cloud Integration (Azure Data Factory (ADF)) | Data on Cloud Platform (AWS)

Posted 1 month ago

Apply

0 years

0 Lacs

Anupgarh, Rajasthan, India

Remote

We are a multinational beverage and food corporation founded in 1885, with operations in more than 14 countries and more than 15,000 employees. We have the largest beverage portfolio in the region, and we count on strategic partners such as PepsiCo and AB InBev. Over the last year we have expanded globally, dividing into four business units: apex (transformation), cbc (distribution), beliv (beverage innovation), and bia (food). As part of our dynamic expansion and growth strategy, we are looking for talent to join our corporation. Apply directly at getonbrd.com.

Role Functions
Design and implement scalable, efficient, and maintainable data engineering solutions using technologies such as Azure Data Factory (ADF), Databricks, and Unity Catalog, applying layered architectures (Bronze/Silver/Gold), ETL/ELT automation with data quality validation, and integrity strategies such as idempotent pipelines and SCD handling. The goal is to guarantee reliable data, optimized for cost and performance, aligned with business needs, and backed by robust documentation and code standards (PEP 8, Git) to ease its evolution and governance.

Role Requirements
- Coordinate the operation of the various environments where data processing runs.
- Extract, transform, and load data so it is aligned with business needs.
- Build efficient integrations for ingesting the data required by the business logic.
- Build continuous integration flows that validate the developed pipelines effectively.
- Mentor junior engineers in good practices and scalable solutions.
- Propose and implement technology improvements that optimize data flows.

Main Challenges
- Requires judgment to design, implement, and maintain an efficient, scalable, and intuitive data structure.
- Requires judgment and experience to follow code best practices when developing market-competitive functionality.
- Process exponentially growing data volumes without letting cloud costs spiral.
- Implement data quality mechanisms that do not impact processing speed.

GETONBRD Job ID: 53848

Conditions
- Health coverage: Global Mobility Apex, S.A. pays or co-pays health insurance for employees.
- Computer provided: Global Mobility Apex, S.A. provides a computer for your work.
- Informal dress code: no dress code is enforced.
- Remote work policy: locally remote only. The position is 100% remote, but candidates must reside in Chile, Colombia, Ecuador, Peru, Mexico, Guatemala, or El Salvador.
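The layered Bronze/Silver/Gold architecture and idempotent-pipeline requirement above can be sketched in PySpark with Delta Lake. Table names and keys are hypothetical, and this shows a Type 1 upsert rather than a full SCD Type 2 (which would additionally expire superseded rows):

```python
# Minimal sketch: idempotent Bronze -> Silver upsert with a Delta MERGE.
# Table names and keys are hypothetical; assumes Databricks + Delta Lake.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingested records, possibly duplicated by ingestion retries.
updates = (
    spark.table("bronze.customers")
    .filter(F.col("_ingest_date") == "2024-06-01")
    .dropDuplicates(["customer_id"])
)

silver = DeltaTable.forName(spark, "silver.customers")

# MERGE makes the load idempotent: re-running the same batch updates
# matched rows instead of appending duplicates.
(
    silver.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```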

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Data Analyst (Snowflake)
Job ID: POS-10121
Primary Skill: Python
Secondary Skills: Snowflake, ADF, and SQL
Location: Hyderabad
Mode of Work: Work from Office
Experience: 5-7 years

About the Job
Are you someone with an in-depth understanding of ETL and a strong background in developing Snowflake and ADF ETL-based solutions, who can develop, document, unit test, and maintain ETL applications and deliver successful code meeting customer expectations? If yes, this opportunity can be the next step in your career. Read on.

We are looking for a Snowflake and ADF developer to join our Data Leverage team – a team of high-energy individuals who thrive in a rapid-pace and agile product development environment. As a Developer, you will provide accountability in the ETL and data integration space, from the development phase through delivery. You will work closely with the Project Manager, Technical Lead, and client teams. Your prime responsibility will be to develop bug-free code with proper unit testing and documentation. You will provide inputs to planning, estimation, scheduling, and coordination of technical activities related to ETL-based applications. You will be responsible for meeting development schedules and delivering high-quality ETL-based solutions that meet technical specifications and design requirements, ensuring customer satisfaction. You are expected to possess good knowledge of two tools: Snowflake and ADF.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise.

Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Develop modern data warehouse solutions using Snowflake and ADF.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Good understanding of star and snowflake dimensional modeling.
- Good knowledge of Snowflake security, Snowflake SQL, and designing other Snowflake objects.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Cloning, the Optimizer, data sharing, stored procedures, and UDFs.
- Good understanding of Databricks Data and Databricks Delta Lake architecture.
- Experience in Azure Data Factory (ADF) to design, implement, and manage complex data integration and transformation workflows.
- Good understanding of SDLC and Agile methodologies.
- Strong problem-solving and analytical skills, with proven strength in applying root-cause analysis.
- Ability to communicate verbally and in technical writing to all levels of the organization.
- Strong teamwork and interpersonal skills at all levels.
- Dedicated to excellence in one's work; strong organizational skills; detail-oriented and thorough.
- Hands-on experience with support activities; able to create and resolve tickets in Jira, ServiceNow, and Azure DevOps.

Requirements
- Strong experience in Snowflake and ADF.
- Experience working in an onsite/offshore model.
- 5+ years of experience in Snowflake and ADF development.

About the Company
Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT services and solutions to insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers' initiatives. ValueMomentum is among the top 10 insurance-focused IT services firms in North America by number of customers. Leading insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives.

Benefits
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Benefits available to you include:
- Competitive compensation package.
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal.
- Rewards and recognition for extraordinary performers.
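As a rough illustration of the Snowflake utilities the posting names (Streams and Tasks), a sketch using the snowflake-connector-python package; the account, warehouse, and table names are hypothetical:

```python
# Minimal sketch: create a CDC stream and a scheduled task in Snowflake.
# Account, warehouse, and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="transform_wh", database="<db>", schema="STAGING",
)
cur = conn.cursor()

# A stream captures change rows (inserts/updates/deletes) on the raw table.
cur.execute("CREATE STREAM IF NOT EXISTS raw_orders_stream ON TABLE raw_orders")

# A task periodically moves captured changes into the curated table,
# but only when the stream actually has data.
cur.execute("""
    CREATE TASK IF NOT EXISTS load_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT * FROM raw_orders_stream
""")
cur.execute("ALTER TASK load_orders_task RESUME")  # tasks start suspended

cur.close()
conn.close()
```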

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Experience: 7 to 10 years
Mandatory Skills: Data model design, ER diagrams, data warehouses, data strategy; hands-on experience in design and architecture for enterprise data applications.
Good to Have: Python, PySpark, Databricks, Azure services (ADLS, ADF, ADB); good communication and problem-solving skills; some understanding of the CPG domain.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that match your interests!

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

K&K Talents is an international recruiting agency that has been providing technical resources globally since 1993. This position is with one of our clients in India, who is actively hiring candidates to expand their teams.

Title: Data Engineer (SQL & ADF)
Location: Gurgaon, India (Hybrid)
Employment Type: Full-time, Permanent
Notice Period: Immediate

Role:
We are seeking a skilled and proactive SQL + ADF Developer to join our client's data engineering team. The ideal candidate will have strong hands-on experience in SQL development and Azure Data Factory (ADF), plus working knowledge of AWS cloud services. You will be responsible for building and maintaining scalable data integration solutions that support business intelligence and analytics needs.

Responsibilities:
- Develop, optimize, and maintain complex SQL queries, stored procedures, and scripts for large-scale data operations.
- Design and implement data pipelines using Azure Data Factory (ADF) for ETL/ELT processes.
- Integrate and move data between on-premise and cloud-based sources (Azure/AWS).
- Work with AWS services (e.g., S3, RDS, Glue, Lambda) for hybrid-cloud data workflows.
- Collaborate with data analysts, architects, and business teams to understand data requirements.
- Monitor, debug, and optimize ADF pipelines for performance and reliability.
- Document data flows, logic, and pipeline configurations for operational transparency.
- Participate in code reviews and follow data engineering best practices.

Required Skills:
- Experience in SQL development, including performance tuning and stored procedures.
- Hands-on experience with Azure Data Factory (ADF) and building data pipelines.
- Working experience with AWS cloud services for data storage or movement.
- Experience with relational databases such as SQL Server, PostgreSQL, or MySQL.
- Good understanding of data integration concepts, scheduling, and monitoring.
- Strong problem-solving and analytical skills.
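Moving data between on-premise sources and the cloud, as described above, is often a watermark-driven incremental extract. A minimal sketch assuming pyodbc and boto3, with a hypothetical DSN, table, and bucket:

```python
# Minimal sketch: watermark-based incremental extract from SQL Server to S3.
# DSN, table, bucket, and key names are hypothetical.
import csv
import io

import boto3
import pyodbc

conn = pyodbc.connect("DSN=onprem_sqlserver")
cur = conn.cursor()

# Watermark: fetch only rows changed since the last successful run.
last_watermark = "2024-06-01T00:00:00"
cur.execute(
    "SELECT id, amount, modified_at FROM dbo.Orders WHERE modified_at > ?",
    (last_watermark,),
)

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "amount", "modified_at"])
for row in cur.fetchall():
    writer.writerow(list(row))

# Land the extract in S3 for downstream ADF/Glue processing.
boto3.client("s3").put_object(
    Bucket="my-data-lake",
    Key="orders/incr/2024-06-02.csv",
    Body=buf.getvalue().encode("utf-8"),
)
```

In production the watermark would be persisted (e.g., in a control table) and advanced only after a successful load, which is what makes reruns safe.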

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Required Experience: 8+ years
Mode of Work: Remote

Design, develop, modify, and test software applications for the healthcare industry in an agile environment. Duties include:
- Develop, support/maintain, and deploy software to support a variety of business needs
- Provide technical leadership in the design, development, testing, deployment, and maintenance of software solutions
- Design and implement platform and application security for applications
- Perform advanced query analysis and performance troubleshooting
- Coordinate with senior-level stakeholders to ensure the development of innovative software solutions to complex technical and creative issues
- Re-design software applications to improve maintenance cost, testing functionality, platform independence, and performance
- Manage user stories and project commitments in an agile framework to rapidly deliver value to customers
- Deploy and operate software solutions using a DevOps model

Required skills: Azure Delta Lake, ADF, Databricks, PySpark, Oozie, Airflow, Big Data technologies (HBase, Hive), CI/CD (GitHub/Jenkins)
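Since the stack above includes Airflow, a minimal DAG sketch (Airflow 2.4+ API; the DAG id and task logic are hypothetical placeholders):

```python
# Minimal sketch: a daily Airflow DAG with one Python task.
# DAG id and task body are hypothetical; assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for the real extract/transform/load logic.
    print("running daily load")


with DAG(
    dag_id="daily_claims_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```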

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from TCS!

Job Title: Data Scientist / AI/ML Engineer
Required Skillset: AI/ML
Location: Hyderabad, Kolkata, Delhi, Chennai (last preference)
Experience Range: 6-10 years

Job Description
Must-have: AI/ML, Azure ML Studio, AI/ML on Databricks, Python, and CI/CD DevOps.
- Supervised and unsupervised ML and predictive analytics using Python
- Feature generation through data exploration and SME requirements
- Relational database querying
- Applying computational algorithms and statistical methods to structured and unstructured data
- Communicating results through data visualizations
- Programming languages: Python, PySpark
- Big data technologies: Spark with PySpark
- Cloud technologies: Azure (ADF, Databricks, storage accounts, Web App, Key Vault, SQL Server, Function App, Logic App, Synapse, Azure Machine Learning, Azure DevOps)
- RBAC maintenance for Azure roles
- GitHub branching and management
- Terraform scripting for Azure IaC
- Optional: GCP (BigQuery, Dataproc, Cloud Storage)

Thanks & Regards,
Ria Aarthi A.
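As a toy illustration of the supervised-ML work described, a scikit-learn sketch on synthetic data (no real features from the posting are implied):

```python
# Minimal sketch: supervised classification with scikit-learn.
# Data is synthetic; feature names and sizes are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Generate a synthetic binary-classification dataset.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```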

Posted 1 month ago

Apply

2.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description

Job Summary:
We are seeking an experienced ADF Developer to design, build, and maintain data integration solutions using Azure Data Factory, with exposure to Azure Databricks (ADB). The ideal candidate will have hands-on expertise in ETL pipelines, data engineering, and Azure cloud services to support enterprise data initiatives.

Key Responsibilities
- Design and develop scalable ETL pipelines using ADF.
- Integrate ADB for advanced data transformation tasks.
- Optimize and troubleshoot ADF pipelines and queries (SQL, Python, Scala).
- Implement robust data validation, error handling, and performance tuning.
- Collaborate with data architects, analysts, and DevOps teams.
- Maintain technical documentation and support ongoing solution improvements.

Required Qualifications
- Bachelor's/Master's in Computer Science or a related field.
- 2+ years of hands-on ADF experience.
- Strong skills in Python, SQL, and/or Scala.
- Familiarity with ADB and Azure cloud services.
- Solid knowledge of ETL, data warehousing, and performance optimization.

Preferred
- Microsoft Azure Data Engineer certification.
- Exposure to Spark, Hadoop, Git, Agile practices, and domain-specific projects (finance, healthcare, retail).
- Understanding of data governance and compliance.

Skills: ADF, ADB, DataStage
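The data-validation and error-handling responsibility above might look like the following PySpark sketch; the path, columns, and thresholds are hypothetical:

```python
# Minimal sketch: a validation gate in a PySpark pipeline.
# Path, column names, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("/mnt/raw/orders")

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Fail fast so a bad batch never reaches downstream tables;
# the orchestrator (ADF/ADB job) surfaces the raised error.
if null_keys > 0 or dupes / max(total, 1) > 0.01:
    raise ValueError(f"validation failed: {null_keys} null keys, {dupes} duplicates")

df.write.mode("append").saveAsTable("curated.orders")
```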

Posted 1 month ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description

Job Summary:
We are seeking a seasoned ADF Developer to design, implement, and optimize data integration solutions using Azure Data Factory (ADF) as the primary tool, with added experience in Azure Databricks (ADB) as a plus. The ideal candidate has strong ETL, data engineering, and cloud expertise within the Azure ecosystem.

Key Responsibilities
- Design and develop ETL pipelines using ADF; integrate ADB for complex transformations.
- Write optimized Python, SQL, or Scala code for large-scale data processing.
- Configure ADF pipelines, datasets, linked services, and triggers.
- Ensure high data quality through robust validation, testing, and error handling.
- Optimize pipeline and query performance; troubleshoot issues proactively.
- Collaborate with data architects, analysts, and DevOps teams.
- Maintain clear documentation of pipeline logic and data flows.
- Support users and ensure minimal disruption to business operations.

Required Skills
- 7+ years of hands-on ADF experience.
- Strong in Python, SQL, and/or Scala.
- Experience with ETL, data modeling, and Azure cloud tools.
- Familiarity with Azure Databricks.
- Excellent problem-solving and communication skills.

Preferred
- Microsoft Azure Data Engineer Associate certification.
- Experience with Spark, Hadoop, Git, Agile, and data governance.
- Domain exposure: finance, healthcare, or retail.

Skills: ADF, ADB, DataStage

Posted 1 month ago

Apply

7.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

About the Job

About Beyond Key
We are a Microsoft Gold Partner and a Great Place to Work-certified company. "Happy Team Members, Happy Clients" is a principle we hold dear. We are an international IT consulting and software services firm committed to providing cutting-edge services and products that satisfy our clients' global needs. Our company was established in 2005, and since then we've grown to more than 350 talented, skilled software professionals. Our clients come from the United States, Canada, Europe, Australia, the Middle East, and India, and we create and design IT solutions for them. For more details, visit https://www.beyondkey.com/about.

Job Title: Senior Data Engineer (Power BI, ADF & MS Fabric)
Experience: 7+ years
Location: Indore / Pune (Hybrid/Onsite)
Job Type: Full-time
Open Positions: 1

Key Responsibilities
- Design, develop, and maintain interactive Power BI dashboards and reports with advanced DAX, Power Query, and custom visuals.
- Build and optimize end-to-end data solutions using Microsoft Fabric (OneLake, Lakehouse, Data Warehouse).
- Develop and automate ETL/ELT pipelines using Azure Data Factory (ADF) and Fabric Data Pipelines.
- Architect and manage modern data warehousing solutions (star/snowflake schema) using Fabric Warehouse, Azure Synapse, or SQL Server.
- Implement data modeling, performance tuning, and optimization for large-scale datasets.
- Collaborate with business teams to translate requirements into scalable Fabric-based analytics solutions.
- Ensure data governance, security, and compliance across BI platforms.
- Mentor junior team members on Fabric, Power BI, and cloud data best practices.

Required Skills & Qualifications
- 7+ years of hands-on experience in Power BI, SQL, data warehousing, and ETL/ELT.
- Strong expertise in Microsoft Fabric (Lakehouse, Warehouse, ETL workflows, Delta Lake).
- Proficient in Azure Data Factory (ADF) for orchestration and data integration.
- Advanced SQL (query optimization, stored procedures, partitioning).
- Experience with data warehousing (dimensional modeling, SCD, fact/dimension tables).
- Knowledge of Power BI Premium/Fabric capacity, deployment pipelines, and DAX patterns.
- Familiarity with Databricks, PySpark, or Python (for advanced analytics) is a plus.
- Strong problem-solving and stakeholder management skills.

Preferred Qualifications
- Microsoft certifications (PL-300: Power BI; DP-600: Fabric Analytics Engineer).
- Experience with Azure DevOps (CI/CD for Fabric/Power BI deployments).
- Domain knowledge in BFSI, Retail, or Manufacturing.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

JD for a Databricks Data Engineer

Key Responsibilities:
- Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark.
- Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing.
- Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks.
- Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows.
- Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing.
- Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage.
- Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools.
- Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management.
- Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption.
- Collaborate with data scientists, analysts, and business teams to deliver insights.

Required Skills & Experience:
- 5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake.
- Strong SQL, PySpark, and Python programming skills.
- Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow.
- Expertise in performance tuning, indexing, caching, and parallel processing.
- Hands-on experience with Lakehouse architecture and Databricks SQL.
- Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog).
- Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins).
- Familiarity with Airflow, Databricks Workflows, or other orchestration tools.
- Strong problem-solving skills, with experience troubleshooting Spark jobs.

Nice to Have:
- Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks.
- Certifications in Databricks, Azure, AWS, or GCP.
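The Delta Lake optimization duties above (compaction, Z-ordering) reduce to a couple of Databricks SQL statements. A sketch with hypothetical table and column names:

```python
# Minimal sketch: Delta Lake layout optimization on Databricks.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows by a frequent filter column,
# so queries filtering on user_id can skip most data files.
spark.sql("OPTIMIZE gold.events ZORDER BY (user_id)")

# Remove data files no longer referenced by the table
# (subject to the default 7-day retention window).
spark.sql("VACUUM gold.events")
```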

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

Job details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237233

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Our global team creates innovative, integrated solutions.

Software Developer – Data Solutions (ETL)

Johnson Controls is seeking an experienced ETL Developer responsible for designing, implementing, and managing ETL processes. The successful candidate will work closely with data architects, business analysts, and stakeholders to ensure data is extracted, transformed, and loaded accurately and efficiently for reporting and analytics purposes.

Key Responsibilities
- Design, develop, and implement ETL processes to extract data from various sources
- Transform data to meet business requirements and load it into data warehouses or databases
- Optimize ETL processes for performance and reliability
- Collaborate with data architects and analysts to define data requirements and ensure data quality
- Monitor ETL jobs and resolve issues as they arise
- Create and maintain documentation of ETL processes and workflows
- Participate in data modeling and database design

Qualifications
- Bachelor's degree in computer science, Information Technology, or a related field
- 3 to 5 years of experience as an ETL Developer or in a similar role
- Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
- Proficient in SQL for data manipulation and querying
- Experience with data warehousing concepts and methodologies
- Knowledge of scripting languages (e.g., Python, Shell) is a plus
- Excellent problem-solving skills and attention to detail
- Strong communication skills to collaborate with technical and non-technical stakeholders
- Flexibility and willingness to work across the delivery landscape, including but not limited to Agile application development, support, and deployment
- Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob, Azure Storage Explorer, Snowflake, and Snowpark

What we offer
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.
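Given that the posting calls out Snowflake and Snowpark, a rough Snowpark-for-Python sketch with hypothetical connection parameters and tables:

```python
# Minimal sketch: a Snowpark transformation, raw -> curated.
# Connection parameters and table names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<wh>", "database": "<db>", "schema": "RAW",
}).create()

# Aggregate shipped orders by region, pushed down to Snowflake compute.
orders = (
    session.table("RAW.ORDERS")
    .filter(F.col("STATUS") == "SHIPPED")
    .group_by("REGION")
    .agg(F.sum("AMOUNT").alias("TOTAL_AMOUNT"))
)

# Materialize the result for reporting.
orders.write.mode("overwrite").save_as_table("CURATED.ORDERS_BY_REGION")
session.close()
```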

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

Job details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237243

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Join us and be part of a team that prioritizes innovation and customer satisfaction.

What you will do:
- Design, develop, and implement ETL processes to extract data from various sources
- Transform data to meet business requirements and load it into data warehouses or databases
- Optimize ETL processes for performance and reliability
- Collaborate with data architects and analysts to define data requirements and ensure data quality
- Monitor ETL jobs and resolve issues as they arise
- Create and maintain documentation of ETL processes and workflows
- Participate in data modeling and database design; understand requirements and provide appropriate solutions

What we look for:
Required:
- Bachelor's degree in computer science, Information Technology, or a related field
- 3 to 5 years of experience as an ETL Developer or in a similar role
- Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
- Proficient in SQL for data manipulation and querying
- Experience with data warehousing concepts and methodologies
- Knowledge of scripting languages (e.g., Python, Shell) is a plus
- Excellent problem-solving skills and attention to detail
- Strong communication skills to collaborate with technical and non-technical stakeholders
- Flexibility and willingness to work across the delivery landscape, including but not limited to Agile application development, support, and deployment
- Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob, Azure Storage Explorer, Snowflake, and Snowpark

What we offer:
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Skills: Oracle CC&B, Oracle Cloud, Java, PL/SQL, Oracle ADF, Oracle JET

Greetings from Colan Infotech!

Job Title: Oracle CC&B Developer & Administrator (OCI)
Location: Remote
Department: IT / Enterprise Applications

Job Summary
We are looking for a highly skilled Oracle Customer Care & Billing (CC&B) Developer & Administrator with experience managing CC&B on Oracle Cloud Infrastructure (OCI). This role is critical to supporting and enhancing our utility billing platform through custom development, system upgrades, issue resolution, and infrastructure management. The ideal candidate is technically strong, detail-oriented, and experienced in both back-end and front-end CC&B development.

Key Responsibilities

Development & Customization
- Design and develop enhancements and custom modules for Oracle CC&B using Java, PL/SQL, Oracle ADF, and Oracle JET.
- Implement business rules, workflows, batch processes, and UI changes based on stakeholder requirements.
- Build RESTful APIs and integrations with internal and third-party systems (e.g., MDM, GIS, payment gateways).

Upgrades & Maintenance
- Lead full-lifecycle CC&B upgrades, including planning, testing, migration, and production deployment.
- Apply and test Oracle patches and interim fixes; resolve any post-patch issues.

OCI Administration
- Manage CC&B environments hosted on Oracle Cloud Infrastructure (OCI), including Compute, Autonomous Database, Load Balancers, and Object Storage.
- Configure and monitor system performance using Oracle Enterprise Manager (OEM).
- Implement backup, recovery, and high-availability strategies aligned with security best practices.

Support & Issue Resolution
- Provide daily operational support and issue resolution for the CC&B application and infrastructure.
- Perform root cause analysis and deliver long-term fixes for recurring issues.
- Monitor, tune, and optimize system performance (JVM, SQL, WebLogic).

Documentation & Collaboration
- Maintain detailed documentation, including technical specs, runbooks, and support procedures.
- Collaborate with QA, infrastructure, and business teams to ensure smooth operations and releases.
- Use Bitbucket for version control and code collaboration.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience with Oracle CC&B development and administration.
- Proven experience with CC&B upgrades, patching, and environment management.
- Strong development skills in Java (8+), PL/SQL, Oracle ADF, and Oracle JET.
- Solid experience with OCI components, including Compute, Autonomous Database, IAM, and networking.
- Proficiency with Oracle Enterprise Manager (OEM) for monitoring and diagnostics.
- Experience using Bitbucket or similar version control platforms.
- Strong problem-solving and communication skills.
- Ability to work both independently and as part of a cross-functional team.

Preferred Qualifications
- Experience with Oracle SOA Suite or Oracle Integration Cloud.
- Knowledge of utility billing processes and customer service workflows.
- Experience working in agile or hybrid project environments.

Interested candidates, please send your updated resume to kumudha.r@colanonline.com

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Role: Azure Data Engineer
Experience: Minimum 3-5 years
Location: Spaze ITech Park, Sector 49, Gurugram
Working Days: Monday to Friday (9:00 AM to 6:00 PM)
Joining: Within 15 days

About Us
Panamoure is a UK-based group with an offshore centre in Gurgaon, India. We are known as the ultimate business and technology change partner for our clients, including PE groups and ambitious mid-market businesses. Panamoure is a fast-paced and dynamic management consultancy delivering business and technology change services to the UK's fastest-growing companies. Our ability to deliver exceptional quality to our clients has seen us grow rapidly over the last 36 months, and we have ambitious plans to scale substantially further. As part of this growth, we are looking to expand both our UK and India teams with bright, ambitious, and talented individuals who want to learn and grow with the business.

Primary Skills
The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities
- Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries.
- Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration.
- Build and optimize ETL/ELT processes to transfer data between Azure Data Lake, SQL Server, and other systems.
- Design and implement Microsoft Fabric Lakehouses for structured and unstructured data storage.
- Build scalable ETL/ELT pipelines to move and transform data across Azure Data Lake, SQL Server, and external data sources.
- Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases.
- Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage.
- Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, and Fabric monitoring capabilities.
- Ensure data integrity, consistency, and security, following data governance frameworks such as Azure Purview.
- Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment.
- Utilize Azure Monitor, Log Analytics, and Application Insights for pipeline monitoring and performance optimization.
- Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Requirements
- 4+ years of experience in data engineering with strong expertise in SQL development.
- Proficiency in SQL Server, T-SQL, and query optimization techniques.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database.
- Solid understanding of ETL/ELT processes, data integration patterns, and data transformation.
- Practical experience with Microsoft Fabric components: Fabric Dataflows for self-service data preparation; Fabric Lakehouses for unified data storage; Fabric Synapse Real-Time Analytics for streaming data insights; Fabric Direct Lake mode with Power BI for optimized performance.
- Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management.
- Proficiency in Python or Scala for data transformation tasks.
- Experience with Azure DevOps, Git, and CI/CD pipeline automation.
- Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC.
- Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates.
- Understanding of data security protocols such as data encryption and network security groups (NSGs).
- Familiarity with streaming services like Azure Event Hubs or Kafka is a plus.
- Excellent problem-solving, communication, and team collaboration skills.
- Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer
- Opportunity to work with modern data architectures and Microsoft Fabric innovations.
- Competitive salary and benefits package, tailored to experience and qualifications.
- Opportunities for professional growth and development in a supportive and collaborative environment.
- A culture that values diversity, creativity, and a commitment to excellence.

Benefits and Perks
- Provident Fund
- Health insurance
- Flexible timing
- Office lunch provided

How to Apply
Interested candidates should submit their resume and a cover letter detailing their data engineering experience, SQL expertise, and familiarity with Microsoft Fabric to hr@panamoure.com

We look forward to adding a skilled Azure Data Engineer to our team! (ref: hirist.tech)
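The star-schema work described above centers on resolving surrogate keys when loading fact tables. A minimal PySpark sketch with hypothetical tables and columns:

```python
# Minimal sketch: resolve a fact table's surrogate keys against a dimension.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

staged_sales = spark.table("staging.sales")       # carries business keys only
dim_customer = spark.table("dw.dim_customer")     # surrogate key: customer_sk

fact_sales = (
    staged_sales.join(
        dim_customer.select("customer_id", "customer_sk"),
        on="customer_id",
        how="left",
    )
    # Unmatched rows map to a reserved "unknown member" key (-1),
    # so late-arriving dimensions never drop fact rows.
    .withColumn("customer_sk", F.coalesce(F.col("customer_sk"), F.lit(-1)))
    .drop("customer_id")
)

fact_sales.write.mode("append").saveAsTable("dw.fact_sales")
```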

Posted 1 month ago

Apply

8.0 years

0 Lacs

India

Remote

Hi, please go through the requirements below, and if interested, forward your resume along with your contact information to raja@covetitinc.com.

Role: Data Engineer
Location: Remote

JOB PURPOSE
This position will help design, develop, and provide operational support for data integration/ETL projects and activities. He or she will also be required to guide and mentor other data engineers; coordinate, assign, and oversee tasks related to ETL projects; and work with functional analysts, end users, and other BI team members to design effective ETL solutions and data integration pipelines.

ESSENTIAL FUNCTIONS AND RESPONSIBILITIES
The following are the essential functions of this position. This position may be responsible for performing additional duties and tasks as needed and assigned.
- Technical design, development, testing, and documentation of data warehouse/ETL projects
- Perform data profiling and logical/physical data modeling to build new ETL designs and solutions
- Develop, implement, and deploy ETL solutions to update the data warehouse and data marts
- Maintain quality control, document technical specs, and unit test to ensure accuracy and quality of BI data
- Implement, stabilize, and establish a DevOps process for version control and deployment from non-prod to prod environments
- Troubleshoot, debug, and diagnose ETL issues
- Provide production support and work with other IT team members and end users to resolve data refresh issues; provide off-hours operational support as needed
- Performance-tune and enhance SQL and ETL processes and prepare related technical documentation
- Work with the offshore team to coordinate development work and operational support
- Keep abreast of the latest ETL technologies and plan their effective use
- Be a key player in planning the migration of our EDW system to a modern global data warehouse architecture
- Assess and implement new EDW/cloud technologies to help evolve the EDW architecture for efficiency and performance
- Communicate clearly and professionally, in writing and verbally, with users, peers, and all levels of management
- Lead ETL tasks and activities related to BI projects; assign, coordinate, and follow up on activities to meet ETL project timelines; follow through and ensure proper closure of service request issues
- Help with AI/ML projects as assigned
- Perform code reviews on ETL/report changes where appropriate
- Coordinate with the DBA team on migration, configuration, and tuning of ETL code
- Act as a mentor for other data engineers on the BI team
- Adhere to the processes and work policies defined by management
- Perform other duties as needed

MINIMUM QUALIFICATIONS
The requirements listed below are representative of the education, knowledge, skill, and/or ability required for this position.

Education/Certifications:
- Requires a minimum of 8 years of related experience with a Bachelor's degree in computer science, MIS, data science, or a related field; or 6 years and a Master's degree.

Experience, Skills, Knowledge and/or Abilities:
- Understanding of ERP business processes (Order to Cash, Procure to Pay, Record to Report, etc.), data warehouse and BI concepts, and the ability to apply educational and practical experience to improve business intelligence applications and provide simplified, standardized solutions that achieve business objectives.
- Expert knowledge of data warehouse architecture; well versed in modern data warehouse concepts and EDW and Data Lake/cloud architecture.
- Expertise in dimensional modeling and star schema designs, including best practices for the use of indexes, partitioning, and data loading.
- Advanced experience in SQL, writing stored procedures, and tuning SQL, preferably using Oracle PL/SQL.
- Strong experience with data integration using ADF (Azure Data Factory).
- Well versed in database administration tasks and in working with DBAs to monitor and resolve SQL/ETL issues and tune performance.
- Experience with the DevOps process in ADF, preferably using GitHub; experience with other version control tools is helpful.
- Experience troubleshooting data warehouse refresh issues and validating BI report data against source systems.
- Excellent communication skills.
- Ability to organize and handle multiple tasks simultaneously.
- Ability to mentor and coordinate activities for other data engineers as needed.

PREFERRED QUALIFICATIONS
The education, knowledge, skills, and/or abilities listed below are preferred qualifications in addition to the minimum qualifications stated above.
- Experience working with Oracle EBS or any major ERP system such as SAP.
- Experience with AI/ML; experience in R, Python, or PySpark is a plus.
- Experience with cloud EDW technologies such as Databricks, Snowflake, or Synapse.
- Experience with Microsoft Fabric, Data Lakehouse concepts, and related reporting capabilities.

PHYSICAL REQUIREMENTS / ADVERSE WORKING CONDITIONS
The physical requirements listed in this section include, but are not limited to, the motor/physical abilities, skills, and/or demands required of the position in order to successfully undertake its essential duties and responsibilities. In accordance with the Americans with Disabilities Act (ADA), reasonable accommodations may be made to allow qualified individuals with a disability to perform the essential functions and responsibilities of the position. No additional physical requirements or essential functions apply to this position.

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking an experienced and strategic Data Architect to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, Unity Catalog, and Spark, while aligning with best practices in data governance, pipeline automation, and performance optimization.

Key Responsibilities:
- Design and develop scalable data pipelines using Databricks and the medallion architecture (Bronze, Silver, Gold layers).
- Architect and implement data governance frameworks using Unity Catalog and related tools.
- Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment.
- Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes.
- Optimize queries and data structures for performance and cost-efficiency.
- Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control.
- Collaborate with cross-functional teams to define data strategies and drive data quality initiatives.
- Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering.
- Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
- Maintain comprehensive documentation of architecture, processes, and workflows.

Requirements:
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Proven experience as a Data Architect or Senior Data Engineer.
- Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL.
- Hands-on experience with data governance, security frameworks, and catalog management.
- Proficiency in cloud platforms (preferably Azure).
- Experience with CI/CD tools and version control systems like GitHub.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office

#JobOpening Data Engineer (Contract | 6 Months)
Location: Hyderabad | Chennai | Remote flexibility possible
Type: Contract | Duration: 6 months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives, including Data Vault and data lakes

#RequiredSkills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, and dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's in CS, Engineering, Math, or a related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities

#NiceToHave
- Experience with data lakes and Data Vaults
- QA and UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure and AWS)

#ContractDetails
Role: Data Engineer
Contract Duration: 6 months
Location Options: Hyderabad / Chennai (remote flexibility available)

Posted 1 month ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
- Analyse the chosen technologies against the implied target state and leverage good operational knowledge to identify technical and business gaps.
- Provide innovative and practical designs for the design and integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
- Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances, and others to drive an integrated solution development and activation plan.
- Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
- Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems, and presenting solutions and options in a simplified manner for clients.

Microsoft Sustainability Manager configuration and customization:
- Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
- Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
- Develop automation routines and workflows for data ingestion, processing, and transformation.
- Integrate Sustainability Manager with other relevant data platforms and tools.
- Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
- Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
- Collaborate with stakeholders to identify data and reporting needs.
- Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
- Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
- Ensure the data model aligns with relevant ESG frameworks and reporting standards.
- Create clear documentation and maintain data lineage for transparency and traceability.
- Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicator) modelling and analysis:
- Define and develop relevant KPIs for tracking progress towards ESG goals.
- Perform data analysis to identify trends, patterns, and insights related to ESG performance.
- Provide data-driven recommendations for improving the organization's ESG footprint and decision-making.

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
- 3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
- Subject matter expertise in sustainability; relevant experience preferred (across any industry or competency).
- Experience managing large, complex change management programs with multiple global stakeholders (required).
- Strong knowledge of Power Platform (Core), Power Apps (Canvas and model-driven), and Power Automate.
- At least 6+ years of relevant experience on Power Platform Core (Dataverse/CDS, Canvas apps, model-driven apps, Power Portals/Power Pages) and Dynamics CRM/365.
- Strong and proven experience with Power Automate, with an efficiency/performance-driven solution approach.
- Experience in designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
- Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
- Prior experience in go-to-market efforts.
- Strong understanding of data modelling concepts and methodologies.
- Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale on the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, data scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to client solutions.
Maintain a working understanding of the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and the XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript (see the client-scripting sketch after this posting).
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Strong project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
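To illustrate the “custom forms with validations” skill this posting asks for, here is a minimal TypeScript sketch of a Model Driven App OnSave handler written against the @types/xrm typings. The attribute name msdyn_quantity and the notification id are illustrative assumptions, not documented MSM fields.

```typescript
// Minimal sketch of client-side form validation for a Model Driven App,
// registered on the form's OnSave event. Requires the @types/xrm package.
function validateQuantityOnSave(executionContext: Xrm.Events.SaveEventContext): void {
  const formContext = executionContext.getFormContext();
  // "msdyn_quantity" is an assumed attribute name for illustration only.
  const qty = formContext.getAttribute("msdyn_quantity")?.getValue() as number | null;
  if (qty === null || qty < 0) {
    // Block the save and surface an error notification on the form.
    executionContext.getEventArgs().preventDefault();
    formContext.ui.setFormNotification(
      "Quantity must be a non-negative number.", "ERROR", "qty_validation"
    );
  } else {
    formContext.ui.clearFormNotification("qty_validation");
  }
}
```

In practice the function is uploaded as a web resource and wired to the form's OnSave event in the form designer, with “Pass execution context as first parameter” enabled.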

Posted 1 month ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
Analyse the chosen technologies against the implied target state and leverage sound operational knowledge to identify technical and business gaps.
Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan.
Create sales and delivery collateral, online knowledge communities and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems and presenting solutions and options in a simplified manner for clients and the business.

Microsoft Sustainability Manager configuration and customization:
Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
Develop automation routines and workflows for data ingestion, processing, and transformation.
Integrate Sustainability Manager with other relevant data platforms and tools.
Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
Collaborate with stakeholders to identify data and reporting needs.
Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
Ensure the data model aligns with relevant ESG frameworks and reporting standards.
Create clear documentation and maintain data lineage for transparency and traceability.
Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicator) modelling and analysis:
Define and develop relevant KPIs for tracking progress towards ESG goals.
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability and relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of the Power Platform (Core), Power Apps (Canvas and Model Driven), and Power Automate.
6+ years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Power Portals/Power Pages) and Dynamics CRM/365.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc. (see the pipeline-trigger sketch after this posting).
Ability to communicate effectively with, and manage, diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
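The Azure Data Factory experience referenced above often involves orchestrating pipeline runs programmatically. The following TypeScript sketch calls the documented ARM REST operation Pipelines - Create Run (api-version 2018-06-01); subscription, resource group, factory, and pipeline names are placeholders, and token acquisition (for example via @azure/identity) is omitted.

```typescript
// Minimal sketch: trigger an Azure Data Factory pipeline run via the ARM REST API.
async function runAdfPipeline(
  subscriptionId: string, resourceGroup: string, factoryName: string,
  pipelineName: string, armToken: string
): Promise<string> {
  const url =
    `https://management.azure.com/subscriptions/${subscriptionId}` +
    `/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory` +
    `/factories/${factoryName}/pipelines/${pipelineName}/createRun` +
    `?api-version=2018-06-01`;
  const resp = await fetch(url, {
    method: "POST",
    headers: { Authorization: `Bearer ${armToken}`, "Content-Type": "application/json" },
    body: JSON.stringify({}), // optional pipeline parameters would go here
  });
  if (!resp.ok) throw new Error(`createRun failed: ${resp.status}`);
  const { runId } = (await resp.json()) as { runId: string };
  return runId; // poll the Pipeline Runs - Get operation with this id to track status
}
```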

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale on the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, data scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to client solutions.
Maintain a working understanding of the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and the XRM APIs (see the FetchXML sketch after this posting).
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Strong project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
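As a pointer to the FetchXML and XRM API skills listed above, here is a minimal TypeScript sketch that runs a FetchXML query through the client-side Xrm.WebApi (typed via @types/xrm). The entity and attribute names (msdyn_facility, msdyn_name) are illustrative assumptions rather than confirmed Sustainability Manager schema.

```typescript
// Minimal sketch: retrieve records with FetchXML via Xrm.WebApi.retrieveMultipleRecords.
async function activeFacilityNames(): Promise<string[]> {
  const fetchXml = `
    <fetch top="50">
      <entity name="msdyn_facility">
        <attribute name="msdyn_name" />
        <filter>
          <condition attribute="statecode" operator="eq" value="0" />
        </filter>
      </entity>
    </fetch>`;
  // Passing "?fetchXml=..." as the options string is the documented pattern.
  const result = await Xrm.WebApi.retrieveMultipleRecords(
    "msdyn_facility", `?fetchXml=${encodeURIComponent(fetchXml)}`
  );
  return result.entities.map((e) => e.msdyn_name as string);
}
```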

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale on the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, data scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
Stay updated on the latest trends and technologies in sustainable software development and apply them to client solutions.
Maintain a working understanding of the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and the XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage (see the component skeleton after this posting).
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Strong project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
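For the PCF (PowerApps Component Framework) experience this posting calls an added advantage, a standard control follows the fixed init/updateView/getOutputs/destroy lifecycle. The TypeScript skeleton below is a minimal sketch: the class name, the bound property "value", and the IInputs/IOutputs types (normally generated by the pac CLI from the control manifest) are assumptions for illustration.

```typescript
// Minimal sketch of a PCF standard control; IInputs/IOutputs are generated
// from the ControlManifest by the Power Platform CLI (pac pcf init).
export class EmissionGauge implements ComponentFramework.StandardControl<IInputs, IOutputs> {
  private container!: HTMLDivElement;

  public init(
    context: ComponentFramework.Context<IInputs>,
    notifyOutputChanged: () => void,
    state: ComponentFramework.Dictionary,
    container: HTMLDivElement
  ): void {
    this.container = container; // cache the host element for later rendering
  }

  public updateView(context: ComponentFramework.Context<IInputs>): void {
    // Re-render whenever a bound property changes; "value" is an assumed
    // manifest property name for illustration.
    this.container.textContent = `CO2e: ${context.parameters.value.raw ?? 0}`;
  }

  public getOutputs(): IOutputs {
    return {}; // read-only control: no outputs to report
  }

  public destroy(): void {
    // cleanup (event listeners, timers) would go here
  }
}
```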

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies