5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
JD for a Databricks Data Engineer

Key Responsibilities:
- Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark.
- Implement the medallion architecture (Bronze, Silver, Gold layers) for efficient data processing.
- Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks.
- Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows.
- Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing.
- Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage.
- Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools.
- Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management.
- Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption.
- Collaborate with Data Scientists, Analysts, and Business Teams to deliver insights.

Required Skills & Experience:
- 5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake.
- Strong SQL, PySpark, and Python programming skills.
- Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow.
- Expertise in performance tuning, indexing, caching, and parallel processing.
- Hands-on experience with Lakehouse architecture and Databricks SQL.
- Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog).
- Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins).
- Familiarity with Airflow, Databricks Workflows, or other orchestration tools.
- Strong problem-solving skills with experience in troubleshooting Spark jobs.

Nice to Have:
- Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks.
- Certifications in Databricks, Azure, AWS, or GCP.
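The medallion architecture this posting refers to layers raw (Bronze), cleansed (Silver), and aggregated (Gold) data. A minimal pure-Python sketch of the idea follows; in Databricks these would be PySpark DataFrames backed by Delta tables, and the record shapes and field names here are illustrative assumptions, not any particular employer's schema:

```python
# Illustrative medallion-style flow: Bronze (raw) -> Silver (cleansed) -> Gold (aggregated).
# Plain dicts stand in for Delta tables; the field names are hypothetical.

bronze = [  # raw ingested events, possibly dirty
    {"order_id": "1", "amount": "120.50", "region": "south"},
    {"order_id": "2", "amount": "bad",    "region": "south"},   # malformed amount
    {"order_id": "3", "amount": "80.00",  "region": "north"},
]

def to_silver(rows):
    """Cleanse: cast types, normalize strings, drop rows that fail validation."""
    silver = []
    for r in rows:
        try:
            silver.append({"order_id": int(r["order_id"]),
                           "amount": float(r["amount"]),
                           "region": r["region"].strip().lower()})
        except ValueError:
            continue  # quarantine/skip malformed records
    return silver

def to_gold(rows):
    """Aggregate: total amount per region, ready for BI consumption."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'south': 120.5, 'north': 80.0}
```

The same shape scales up directly: each function becomes a PySpark transformation writing its own Delta layer, and the malformed-record branch becomes a quarantine table.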
Posted 3 weeks ago
0.0 - 5.0 years
0 Lacs
Pune, Maharashtra
On-site
Job details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237233

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Our global team creates innovative, integrated solutions that make cities more connected.

Software Developer – Data Solutions (ETL)

Johnson Controls is seeking an experienced ETL Developer responsible for designing, implementing, and managing ETL processes. The successful candidate will work closely with data architects, business analysts, and stakeholders to ensure data is extracted, transformed, and loaded accurately and efficiently for reporting and analytics purposes.

Key Responsibilities
o Design, develop, and implement ETL processes to extract data from various sources
o Transform data to meet business requirements and load it into data warehouses or databases
o Optimize ETL processes for performance and reliability
o Collaborate with data architects and analysts to define data requirements and ensure data quality
o Monitor ETL jobs and resolve issues as they arise
o Create and maintain documentation of ETL processes and workflows
o Participate in data modeling and database design

Qualifications
o Bachelor's degree in Computer Science, Information Technology, or a related field
o 3 to 5 years of experience as an ETL Developer or similar role
o Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
o Proficient in SQL for data manipulation and querying
o Experience with data warehousing concepts and methodologies
o Knowledge of scripting languages (e.g., Python, Shell) is a plus
o Excellent problem-solving skills and attention to detail
o Strong communication skills to collaborate with technical and non-technical stakeholders
o Flexibility to work across the delivery landscape, including but not limited to Agile application development, support, and deployment
o Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob, Azure Storage Explorer, Snowflake, and Snowpark

What we offer
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.
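The extract/transform/load loop that these responsibilities describe can be sketched end to end with the standard-library sqlite3 module standing in for both the source system and the warehouse. The table and column names below are invented for illustration; a real pipeline of the kind this posting describes would run in ADF or Synapse against Snowflake:

```python
import sqlite3

# Hypothetical source system and warehouse, both in-memory SQLite for illustration.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, country TEXT)")
src.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, "10.5", " IN"), (2, "n/a", "US"), (3, "7.25", "in ")])

wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE fact_orders (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")

# Extract: pull raw rows from the source.
rows = src.execute("SELECT id, amount, country FROM raw_orders").fetchall()

# Transform: cast amounts, normalize country codes, reject bad rows.
clean = []
for oid, amount, country in rows:
    try:
        clean.append((oid, float(amount), country.strip().upper()))
    except ValueError:
        continue  # rows with unparseable amounts are dropped here

# Load: write the conformed rows into the warehouse fact table.
wh.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
wh.commit()

print(wh.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())  # (2, 17.75)
```

In production the reject branch would typically route bad rows to an error table for the data-quality checks the posting mentions, rather than silently dropping them.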
Posted 3 weeks ago
0.0 - 5.0 years
0 Lacs
Pune, Maharashtra
On-site
Job details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30237243

Job Description
At Johnson Controls, we're shaping the future to create a world that's safe, comfortable, and sustainable. Join us and be part of a team that prioritizes innovation and customer satisfaction.

What you will do:
o Design, develop, and implement ETL processes to extract data from various sources
o Transform data to meet business requirements and load it into data warehouses or databases
o Optimize ETL processes for performance and reliability
o Collaborate with data architects and analysts to define data requirements and ensure data quality
o Monitor ETL jobs and resolve issues as they arise
o Create and maintain documentation of ETL processes and workflows
o Participate in data modeling and database design, gathering requirements and providing appropriate solutions

What we look for:
Required:
o Bachelor's degree in Computer Science, Information Technology, or a related field
o 3 to 5 years of experience as an ETL Developer or similar role
o Strong knowledge of ETL tools (ADF, Synapse); Snowflake experience is mandatory; multi-cloud experience is a plus
o Proficient in SQL for data manipulation and querying
o Experience with data warehousing concepts and methodologies
o Knowledge of scripting languages (e.g., Python, Shell) is a plus
o Excellent problem-solving skills and attention to detail
o Strong communication skills to collaborate with technical and non-technical stakeholders
o Flexibility to work across the delivery landscape, including but not limited to Agile application development, support, and deployment
o Expert-level experience with Azure Data Lake, Azure Data Factory, Synapse, Azure Blob, Azure Storage Explorer, Snowflake, and Snowpark

What we offer:
Competitive salary and a comprehensive benefits package, including health, dental, and retirement plans. Opportunities for continuous professional development, training programs, and career advancement within the company. A collaborative, innovative, and inclusive work environment that values diversity and encourages creative problem-solving.
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Skills: Oracle CC&B, Oracle Cloud, Java, PL/SQL, Oracle ADF, Oracle JET

Greetings from Colan Infotech!

Job Title: Oracle CC&B Developer & Administrator (OCI)
Location: Remote
Department: IT / Enterprise Applications

Job Summary
We are looking for a highly skilled Oracle Customer Care & Billing (CC&B) Developer & Administrator with experience managing CC&B on Oracle Cloud Infrastructure (OCI). This role is critical to supporting and enhancing our utility billing platform through custom development, system upgrades, issue resolution, and infrastructure management. The ideal candidate is technically strong, detail-oriented, and experienced in both back-end and front-end CC&B development.

Key Responsibilities

Development & Customization
- Design and develop enhancements and custom modules for Oracle CC&B using Java, PL/SQL, Oracle ADF, and Oracle JET.
- Implement business rules, workflows, batch processes, and UI changes based on stakeholder requirements.
- Build RESTful APIs and integrations with internal and third-party systems (e.g., MDM, GIS, payment gateways).

Upgrades & Maintenance
- Lead full-lifecycle CC&B upgrades, including planning, testing, migration, and production deployment.
- Apply and test Oracle patches and interim fixes; resolve any post-patch issues.

OCI Administration
- Manage CC&B environments hosted on Oracle Cloud Infrastructure (OCI), including Compute, Autonomous Database, Load Balancers, and Object Storage.
- Configure and monitor system performance using Oracle Enterprise Manager (OEM).
- Implement backup, recovery, and high-availability strategies aligned with security best practices.

Support & Issue Resolution
- Provide daily operational support and issue resolution for the CC&B application and infrastructure.
- Perform root cause analysis and deliver long-term fixes for recurring issues.
- Monitor, tune, and optimize system performance (JVM, SQL, WebLogic).

Documentation & Collaboration
- Maintain detailed documentation, including technical specs, runbooks, and support procedures.
- Collaborate with QA, infrastructure, and business teams to ensure smooth operations and releases.
- Use Bitbucket for version control and code collaboration.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience with Oracle CC&B development and administration.
- Proven experience with CC&B upgrades, patching, and environment management.
- Strong development skills in Java (8+), PL/SQL, Oracle ADF, and Oracle JET.
- Solid experience with OCI components, including Compute, Autonomous Database, IAM, and networking.
- Proficiency with Oracle Enterprise Manager (OEM) for monitoring and diagnostics.
- Experience using Bitbucket or similar version control platforms.
- Strong problem-solving and communication skills.
- Ability to work both independently and as part of a cross-functional team.

Preferred Qualifications
- Experience with Oracle SOA Suite or Oracle Integration Cloud.
- Knowledge of utility billing processes and customer service workflows.
- Experience working in agile or hybrid project environments.

Interested candidates, send your updated resume to kumudha.r@colanonline.com
Posted 3 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Azure Data Engineer
Experience: Minimum 3-5 years
Location: Spaze ITech Park, Sector-49, Gurugram
Working Days: Monday to Friday (9:00 AM - 6:00 PM)
Joining: Within 15 days

About Us
Panamoure is a UK-based group with an offshore centre in Gurgaon, India. We are known to be the ultimate business and technology change partner for our clients, including PE groups and ambitious mid-market businesses. Panamoure is a fast-paced and dynamic management consultancy delivering business and technology change services to the UK's fastest-growing companies. Our ability to deliver exceptional quality to our clients has seen us grow rapidly over the last 36 months, and we have ambitious plans to scale substantially further. As part of this growth, we are looking to expand both our UK and India teams with bright, ambitious, and talented individuals who want to learn and grow with the business.

Primary Skills
The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities
- Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries.
- Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration.
- Build and optimize scalable ETL/ELT pipelines to move and transform data across Azure Data Lake, SQL Server, and external systems.
- Design and implement Microsoft Fabric Lakehouses for structured and unstructured data storage.
- Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases.
- Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage.
- Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, Application Insights, and Fabric monitoring capabilities.
- Ensure data integrity, consistency, and security, following data governance frameworks such as Azure Purview.
- Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment.
- Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Requirements
- 4+ years of experience in data engineering with strong expertise in SQL development.
- Proficiency in SQL Server, T-SQL, and query optimization techniques.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database.
- Solid understanding of ETL/ELT processes, data integration patterns, and data transformation.
- Practical experience with Microsoft Fabric components: Fabric Dataflows for self-service data preparation; Fabric Lakehouses for unified data storage; Fabric Synapse Real-Time Analytics for streaming data insights; Fabric Direct Lake mode with Power BI for optimized performance.
- Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management.
- Proficiency in Python or Scala for data transformation tasks.
- Experience with Azure DevOps, Git, and CI/CD pipeline automation.
- Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC.
- Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates.
- Understanding of data security protocols such as data encryption and network security groups (NSGs).
- Familiarity with streaming services like Azure Event Hubs or Kafka is a plus.
- Excellent problem-solving, communication, and team collaboration skills.
- Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer
- Opportunity to work with modern data architectures and Microsoft Fabric innovations.
- Competitive salary and benefits package, tailored to experience and qualifications.
- Opportunities for professional growth and development in a supportive and collaborative environment.
- A culture that values diversity, creativity, and a commitment to excellence.

Benefits And Perks
- Provident Fund
- Health Insurance
- Flexible timing
- Office lunch provided

How To Apply
Interested candidates should submit their resume and a cover letter detailing their data engineering experience, SQL expertise, and familiarity with Microsoft Fabric to hr@panamoure.com. We look forward to adding a skilled Azure Data Engineer to our team! (ref:hirist.tech)
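The star-schema modeling called out in the responsibilities above puts numeric measures in a central fact table keyed to descriptive dimension tables. A minimal sqlite3 sketch of the pattern follows; the table and column names are invented for illustration, and a real implementation of the kind this role describes would live in Azure SQL or Synapse:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension tables hold descriptive attributes (hypothetical schema).
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")

# The fact table holds measures plus foreign keys into each dimension.
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER, revenue REAL)""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(20240101, 1, 3, 30.0), (20240101, 2, 1, 25.0), (20240201, 1, 2, 20.0)])

# Typical analytics query: revenue by month and category via joins to the dimensions.
rows = cur.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month""").fetchall()
print(rows)  # [(1, 'Hardware', 55.0), (2, 'Hardware', 20.0)]
```

A snowflake schema differs only in normalizing the dimensions further (e.g., splitting category out of dim_product into its own table), trading simpler storage for an extra join.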
Posted 3 weeks ago
8.0 years
0 Lacs
India
Remote
Hi,

Please go through the requirements below; if interested, forward your resume along with your contact information to raja@covetitinc.com

Role: Data Engineer
Location: Remote

JOB PURPOSE
This position will help design, develop, and provide operational support for data integration/ETL projects and activities. He or she will also guide and mentor other data engineers; coordinate, assign, and oversee tasks related to ETL projects; and work with functional analysts, end users, and other BI team members to design effective ETL solutions and data integration pipelines.

ESSENTIAL FUNCTIONS AND RESPONSIBILITIES
The following are the essential functions of this position. This position may be responsible for performing additional duties and tasks as needed and assigned.
- Technical design, development, testing, and documentation of data warehouse/ETL projects
- Perform data profiling and logical/physical data modeling to build new ETL designs and solutions
- Develop, implement, and deploy ETL solutions to update the data warehouse and data marts
- Maintain quality control, document technical specs, and unit test to ensure the accuracy and quality of BI data
- Implement, stabilize, and establish a DevOps process for version control and deployment from non-prod to prod environments
- Troubleshoot, debug, and diagnose ETL issues
- Provide production support and work with other IT team members and end users to resolve data refresh issues; provide off-hours operational support as needed
- Performance tuning and enhancement of SQL and ETL processes, and preparation of related technical documentation
- Work with the offshore team to coordinate development work and operational support
- Keep abreast of the latest ETL technologies and plan their effective use
- Be a key player in planning the migration of our EDW system to a modern global data warehouse architecture
- Assess and implement new EDW/cloud technologies to evolve the EDW architecture for efficiency and performance
- Communicate clearly and professionally with users, peers, and all levels of management, in both written and verbal form
- Lead ETL tasks and activities related to BI projects; assign, coordinate, and follow up on activities to meet ETL project timelines; follow through and ensure proper closure of service request issues
- Help with AI/ML projects as assigned
- Perform code reviews on ETL/report changes where appropriate
- Coordinate with the DBA team on migration, configuration, and tuning of ETL code
- Act as a mentor for other data engineers on the BI team
- Adhere to the processes and work policies defined by management
- Perform other duties as needed

MINIMUM QUALIFICATIONS
The requirements listed below are representative of the education, knowledge, skill, and/or ability required for this position.

Education/Certifications:
- Requires a minimum of 8 years of related experience with a Bachelor's degree in Computer Science, MIS, Data Science, or a related field; or 6 years and a Master's degree

Experience, Skills, Knowledge and/or Abilities:
- Understanding of ERP business processes (Order to Cash, Procure to Pay, Record to Report, etc.), data warehouse and BI concepts, and the ability to apply educational and practical experience to improve business intelligence applications and provide simplified, standardized solutions that achieve business objectives
- Expert knowledge of data warehouse architecture; well versed in modern data warehouse concepts, EDW, and Data Lake/cloud architecture
- Expertise in dimensional modeling and star schema designs, including best practices for the use of indexes, partitioning, and data loading
- Advanced experience in SQL, writing stored procedures, and tuning SQL, preferably using Oracle PL/SQL
- Strong experience with data integration using ADF (Azure Data Factory)
- Well versed in database administration tasks and in working with DBAs to monitor and resolve SQL/ETL issues and tune performance
- Experience with the DevOps process in ADF, preferably using GitHub; experience with other version control tools is helpful
- Experience troubleshooting data warehouse refresh issues and validating BI report data against source systems
- Excellent communication skills
- Ability to organize and handle multiple tasks simultaneously
- Ability to mentor and coordinate activities for other data engineers as needed

PREFERRED QUALIFICATIONS
The education, knowledge, skills, and/or abilities listed below are preferred qualifications in addition to the minimum qualifications stated above.

Additional Experience, Skills, Knowledge and/or Abilities:
- Experience working with Oracle EBS or other major ERP systems such as SAP
- Experience with AI/ML; experience in R, Python, or PySpark a plus
- Experience with cloud EDW technologies such as Databricks, Snowflake, or Synapse
- Experience with Microsoft Fabric, Data Lakehouse concepts, and related reporting capabilities

PHYSICAL REQUIREMENTS / ADVERSE WORKING CONDITIONS
The physical requirements listed in this section include, but are not limited to, the motor/physical abilities, skills, and/or demands required of the position in order to successfully undertake its essential duties and responsibilities. In accordance with the Americans with Disabilities Act (ADA), reasonable accommodations may be made to allow qualified individuals with a disability to perform the essential functions and responsibilities of the position. No additional physical requirements apply to this position.
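SQL performance tuning of the kind this role calls for often starts with checking whether a query uses an index or falls back to a full scan. A small sqlite3 illustration of reading a query plan before and after adding a covering index (table names are hypothetical; the same workflow applies to Oracle via EXPLAIN PLAN):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index on customer_id, the plan is a full table scan.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_before[0][3])  # e.g. "SCAN orders" (wording varies by SQLite version)

# A covering index on (customer_id, total) lets the engine seek and
# answer the aggregate from the index alone, without touching the table.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_after[0][3])  # e.g. "SEARCH orders USING COVERING INDEX idx_orders_customer ..."
```

The plan rows are (id, parent, notused, detail) tuples; the detail string is the part a tuning session reads. In Oracle the analogous habit is EXPLAIN PLAN plus DBMS_XPLAN.DISPLAY.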
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are seeking an experienced and strategic Data Architect to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, Unity Catalog, and Spark, while aligning with best practices in data governance, pipeline automation, and performance optimization.

Key Responsibilities:
• Design and develop scalable data pipelines using Databricks and the medallion architecture (Bronze, Silver, Gold layers).
• Architect and implement data governance frameworks using Unity Catalog and related tools.
• Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment.
• Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes.
• Optimize queries and data structures for performance and cost-efficiency.
• Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control.
• Collaborate with cross-functional teams to define data strategies and drive data quality initiatives.
• Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering.
• Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
• Maintain comprehensive documentation of architecture, processes, and workflows.

Requirements:
• Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
• Proven experience as a Data Architect or Senior Data Engineer.
• Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL.
• Hands-on experience with data governance, security frameworks, and catalog management.
• Proficiency in cloud platforms (preferably Azure).
• Experience with CI/CD tools and version control systems like GitHub.
• Strong communication and collaboration skills.
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Mumbai
Work from Office
#JobOpening Data Engineer (Contract | 6 Months)
Location: Hyderabad | Chennai | Remote flexibility possible
Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives, including Data Vault and Data Lakes

#RequiredSkills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, and dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's degree in CS, Engineering, Math, or a related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities

#NiceToHave
- Experience with Data Lakes and Data Vaults
- QA & UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure, AWS)

#ContractDetails
Role: Data Engineer
Contract Duration: 6 Months
Location Options: Hyderabad / Chennai (remote flexibility available)
Posted 3 weeks ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
- Analyse the chosen technologies against the implied target state and leverage good operational knowledge to identify technical and business gaps.
- Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
- Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances, and others to drive an integrated solution development and activation plan.
- Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
- Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems and presenting solutions and options to clients in a simplified manner.

Microsoft Sustainability Manager configuration and customization
- Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
- Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
- Develop automation routines and workflows for data ingestion, processing, and transformation.
- Integrate Sustainability Manager with other relevant data platforms and tools.
- Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills
- Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
- Collaborate with stakeholders to identify data and reporting needs.
- Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models
- Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
- Ensure the data model aligns with relevant ESG frameworks and reporting standards.
- Create clear documentation and maintain data lineage for transparency and traceability.
- Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicator) modelling and analysis
- Define and develop relevant KPIs for tracking progress towards ESG goals.
- Perform data analysis to identify trends, patterns, and insights related to ESG performance.
- Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
- 3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
- Subject matter expertise in sustainability, with relevant experience preferred (across any industry or competency).
- Experience managing large, complex change management programs with multiple global stakeholders (required).
- Strong knowledge of Power Platform (Core), Power Apps (Canvas & Model-Driven), and Power Automate.
- At least 6+ years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model-Driven Apps, Power Portals/Power Pages) and Dynamics CRM/365.
- Strong, proven experience with Power Automate, with an efficiency/performance-driven solution approach.
- Experience in designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
- Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
- Prior experience in go-to-market efforts.
- Strong understanding of data modelling concepts and methodologies.
- Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
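The KPI modelling described in the role above often reduces to computing ratio metrics over collected ESG data, such as emissions intensity per unit of revenue. A small illustrative sketch follows; the metric definition, field names, and figures are invented for demonstration and are not EY's or Microsoft's:

```python
# Illustrative ESG KPI: emissions intensity = tonnes of CO2e per million USD of revenue.
# All figures and field names below are hypothetical.

records = [
    {"site": "Plant A", "co2e_tonnes": 1200.0, "revenue_musd": 40.0},
    {"site": "Plant B", "co2e_tonnes": 300.0,  "revenue_musd": 25.0},
]

def emissions_intensity(rows):
    """Aggregate intensity across all sites: total CO2e / total revenue."""
    total_co2e = sum(r["co2e_tonnes"] for r in rows)
    total_rev = sum(r["revenue_musd"] for r in rows)
    return total_co2e / total_rev

kpi = emissions_intensity(records)
print(round(kpi, 2))  # 23.08  (1500 tonnes / 65 MUSD)
```

In a Microsoft Sustainability Manager deployment, the inputs would come from the Common Data Model's emissions tables and the result would surface as a measure in a Power BI report rather than a standalone script.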
Posted 3 weeks ago
7.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Microsoft Sustainability Manager Senior Developer – Consulting As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients. Your Key Responsibilities Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces. Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization. Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges. Design and build engaging dashboards and report in Power BI to visualize sustainability data and track progress towards goals. Develop and maintain KPI models to measure and track key performance indicators for our sustainability initiatives. Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization. Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions. Understanding on Microsoft Cloud for Sustainability Common Data model. 
Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated ability to apply strong project management skills, inspiring teamwork and responsibility among engagement team members.
To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.
Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimal supervision.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Manager - MSM (Microsoft Sustainability Manager) Architect
As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.
Your Key Responsibilities
Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
Analyse the chosen technologies against the implied target state and leverage sound operational knowledge to identify technical and business gaps.
Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan.
Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems, and presenting solutions and options in a simplified manner for clients and the business.
Microsoft Sustainability Manager configuration and customization:
Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
Develop automation routines and workflows for data ingestion, processing, and transformation.
Integrate Sustainability Manager with other relevant data platforms and tools.
Stay up to date on evolving ESG regulations, frameworks, and reporting standards.
Power BI skills:
Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
Collaborate with stakeholders to identify data and reporting needs.
Develop interactive reports and storytelling narratives to effectively communicate ESG performance.
Designing and implementing data models:
Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
Ensure the data model aligns with relevant ESG frameworks and reporting standards.
Create clear documentation and maintain data lineage for transparency and traceability.
Analyse and interpret large datasets relating to environmental, social, and governance performance.
KPI (Key Performance Indicator) modelling and analysis:
Define and develop relevant KPIs for tracking progress towards ESG goals.
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.
To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability and relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of Power Platform (Core), Power Apps (Canvas & Model Driven), Power Automate.
6+ years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Power Portals/Power Pages) and Dynamics CRM/365.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
Able to communicate effectively with and manage diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.
Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimal supervision.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
7.0 - 10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Microsoft Sustainability Manager Senior Developer – Consulting
As a developer in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale using the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.
Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
Stay up to date on the latest trends and technologies in sustainable software development and apply them to our solutions.
Understand the Microsoft Cloud for Sustainability Common Data Model.
Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated ability to apply strong project management skills, inspiring teamwork and responsibility among engagement team members.
To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.
Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimal supervision.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
7.0 - 10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Microsoft Sustainability Manager Senior Developer – Consulting
As a developer in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale using the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.
Your Key Responsibilities
Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives such as carbon footprint tracking, resource management, and supply chain optimization.
Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
Stay up to date on the latest trends and technologies in sustainable software development and apply them to our solutions.
Understand the Microsoft Cloud for Sustainability Common Data Model.
Skills And Attributes For Success
Proven experience as a Microsoft Cloud for Sustainability industry cloud developer, or an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Strong, proven experience creating custom forms with validations using JavaScript.
Experience developing PCF components is an added advantage.
Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
Proficiency in Power Automate for workflow automation and logic implementation.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
Experience in Power BI, including advanced functions, DAX scripting, advanced Power Query, and data modelling on the CDM.
Experience in Power Fx is an added advantage.
Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
Experience leading teams to execute high-quality deliverables within stipulated timelines.
Excellent written and verbal communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrated ability to apply strong project management skills, inspiring teamwork and responsibility among engagement team members.
To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
Excellent communication skills; consulting experience preferred.
Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimal supervision.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Manager - MSM (Microsoft Sustainability Manager) Architect
As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.
Your Key Responsibilities
Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence, and that risks are identified, managed, and mitigated.
Analyse the chosen technologies against the implied target state and leverage sound operational knowledge to identify technical and business gaps.
Provide innovative and practical designs for the integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities.
Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan.
Create sales and delivery collateral, online knowledge communities, and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems, and presenting solutions and options in a simplified manner for clients and the business.
Microsoft Sustainability Manager configuration and customization:
Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
Develop automation routines and workflows for data ingestion, processing, and transformation.
Integrate Sustainability Manager with other relevant data platforms and tools.
Stay up to date on evolving ESG regulations, frameworks, and reporting standards.
Power BI skills:
Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
Collaborate with stakeholders to identify data and reporting needs.
Develop interactive reports and storytelling narratives to effectively communicate ESG performance.
Designing and implementing data models:
Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
Ensure the data model aligns with relevant ESG frameworks and reporting standards.
Create clear documentation and maintain data lineage for transparency and traceability.
Analyse and interpret large datasets relating to environmental, social, and governance performance.
KPI (Key Performance Indicator) modelling and analysis:
Define and develop relevant KPIs for tracking progress towards ESG goals.
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.
To qualify for the role, you must have:
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability and relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of Power Platform (Core), Power Apps (Canvas & Model Driven), Power Automate.
6+ years of relevant experience with Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Power Portals/Power Pages) and Dynamics CRM/365.
Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
Able to communicate effectively with and manage diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.
Ideally, you will also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
The ability to operate independently or with minimal supervision.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
4.0 - 8.0 years
14 - 24 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
4+ years of hands-on experience in .NET, C#, MVC, SQL, and Web API development.
Familiarity with Function Apps, Cosmos DB, Durable Function Apps, Event Grid, Azure Data Factory, Logic Apps, Service Bus, and Storage Accounts is essential.
CTC up to 24 LPA
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile, high-performing DataOps culture.
Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.
Responsibilities
Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
Support the development and automation of operational policies and procedures, improving efficiency and resilience.
Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
Utilize technical expertise in cloud and data operations to support service reliability and scalability.
Qualifications
5+ years of technology work experience in a large-scale global organization; CPG industry experience preferred.
5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
Understanding of operational excellence in complex, high-availability data environments.
Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
Basic understanding of data management concepts, including master data management, data governance, and analytics.
Knowledge of data acquisition, data catalogs, data standards, and data management tools.
Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
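The observability work this posting describes (real-time monitoring, SLA adherence, automated issue identification) typically reduces to small automated checks run against pipeline metadata. A minimal, framework-agnostic sketch in Python; dataset names and SLA thresholds are illustrative, not taken from the posting:

```python
from datetime import datetime, timedelta, timezone

def freshness_alerts(last_loaded, sla_minutes, now=None):
    """Return names of datasets whose last successful load breaches
    its freshness SLA -- the kind of check a DataOps team automates
    for every pipeline and wires into alerting.

    last_loaded: mapping of dataset name -> datetime of last load (UTC).
    sla_minutes: mapping of dataset name -> allowed staleness in minutes
                 (defaults to 60 when a dataset has no explicit SLA).
    """
    now = now or datetime.now(timezone.utc)
    stale = []
    for name, loaded_at in last_loaded.items():
        limit = timedelta(minutes=sla_minutes.get(name, 60))
        if now - loaded_at > limit:
            stale.append(name)
    return sorted(stale)
```

In practice the load timestamps would come from pipeline run history (e.g., ADF or Databricks job metadata) and breaches would page the on-call rotation rather than just return a list.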
Posted 3 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Opening: Senior Data Engineer (Remote, Contract - 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.
Key Responsibilities
Build scalable ETL pipelines and implement robust data solutions in Azure.
Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
Design and maintain a secure and efficient data lake architecture.
Work with stakeholders to gather data requirements and translate them into technical specs.
Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
Monitor data quality, performance bottlenecks, and scalability issues.
Write clean, organized, reusable PySpark code in an Agile environment.
Document pipelines, architectures, and best practices for reuse.
Must-Have Skills
Experience: 6+ years in data engineering.
Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
Agile, SDLC, containerization (Docker), clean coding practices.
Good-to-Have Skills
Event Hubs, Logic Apps.
Power BI.
Strong logic building and a competitive programming background.
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
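The data-quality monitoring this role calls for usually hinges on a promote-or-quarantine step between a raw layer and a cleaned layer. A simplified sketch in plain Python; field names and validation rules are illustrative, and in Databricks this logic would typically live in a PySpark transformation with quarantined rows written to a separate table for inspection:

```python
def promote_to_silver(bronze_rows):
    """Clean raw 'bronze' records and keep only rows that pass quality
    checks; failing rows are quarantined instead of silently dropped.

    Each row is a dict. Rows missing required fields, with a null
    amount, or with a non-positive amount go to the quarantine list.
    """
    required = {"id", "amount"}
    silver, quarantine = [], []
    for row in bronze_rows:
        if not required.issubset(row) or row["amount"] is None:
            quarantine.append(row)
            continue
        cleaned = {**row, "amount": float(row["amount"])}
        if cleaned["amount"] <= 0:
            quarantine.append(row)
        else:
            silver.append(cleaned)
    return silver, quarantine
```

Keeping the quarantine path explicit is what makes "monitor data quality" actionable: the size of the quarantine set per run becomes a metric to alert on.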
Posted 3 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies with more than $79 Billion in Net Revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 Billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals.

We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Machine Learning Services and Pipelines.

PepsiCo Data Analytics & AI Overview:
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale.
This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS will also lead, facilitate and collaborate with the larger DS community in PepsiCo. DS will provide the talent for the development and support of DS components and their life cycle within DA&AI Products, and will support "pre-engagement" activities as requested and validated by the prioritization framework of DA&AI.

Data Scientist: Hyderabad and Gurugram

You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will provide you the correct visibility and understanding of the criticality of your developments.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope
- Active contributor to code & development in projects and services
- Partner with data engineers to ensure data access for discovery and proper data is prepared for model consumption.
- Partner with ML engineers working on industrialization.
- Communicate with business stakeholders in the process of service design, training and knowledge transfer.
- Support large-scale experimentation and build data-driven models.
- Refine requirements into modelling problems.
- Influence product teams through data-based recommendations.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create reusable packages or libraries.
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time)
- Implement the end-to-end ML lifecycle with Azure Machine Learning and Azure Pipelines
- Automate ML model deployments

Qualifications
- BE/B.Tech in Computer Science, Maths, or related technical fields.
- Overall 5+ years of experience working as a Data Scientist.
- 4+ years' experience building solutions in the commercial or supply chain space.
- 4+ years working in a team to deliver production-level analytic solutions.
- Fluent in git (version control). Understanding of Jenkins and Docker is a plus.
- Fluent in SQL syntax.
- 4+ years' experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
- 4+ years' experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development.

Skills, Abilities, Knowledge:
- Data Science – Hands-on experience and strong knowledge of building machine learning models, supervised and unsupervised. Knowledge of time series/demand forecast models is a plus.
- Programming Skills – Hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics – Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure) – Experience in Databricks and ADF is desirable. Familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format. Fluent in one visualization tool.
- Strong communication and organizational skills with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics 'product' creation.
Posted 3 weeks ago
12.0 - 20.0 years
22 - 37 Lacs
Bengaluru
Hybrid
12+ years of experience in Data Architecture. Strong in Azure Data Services & Databricks, including Delta Lake & Unity Catalog. Experience in Azure Synapse, Purview, ADF, DBT, Apache Spark, DWH, Data Lakes, NoSQL, OLTP. Notice period: Immediate. Contact: sachin@assertivebs.com
Posted 3 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components.

The following technology skills are required:
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience with ADF and Dataflow
- Experience with big data tools like Delta Lake and Azure Databricks
- Experience with Synapse
- Skills in designing an Azure Data Solution
- Ability to assemble large, complex data sets that meet functional/non-functional business requirements.
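The "advanced working SQL knowledge" and query authoring called for above typically include transformation-layer queries: grouped aggregation with filtering on the aggregate. A small self-contained sketch using Python's built-in sqlite3; the `sales` table and its columns are invented for illustration, and in an Azure solution the same query would run against Synapse or Databricks SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('South', 100.0), ('South', 50.0), ('North', 75.0);
""")

# Typical transformation-layer query: aggregate per region,
# then filter on the aggregate with HAVING (WHERE cannot see SUM()).
rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > 60
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('South', 150.0), ('North', 75.0)]
conn.close()
```

The WHERE-vs-HAVING distinction shown here (row filter before aggregation vs. group filter after) is one of the most common query-authoring interview checks for roles like this.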
Posted 3 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Location Name: Pune Corporate Office - HO

Job Purpose
Effectively handle development and support in PostgreSQL/Oracle/SQL database technology. Interact with cross-functional business teams to understand business needs and prioritize requirements.

Duties And Responsibilities
- Develop end-to-end functionality/modules, write complex code, perform UAT and production deployment, and manage support.
- Manage a team of 5-6 developers.
- Define project milestones and drive partner/development teams to ensure delivery on time.
- Responsible for delivery schedule, change process management, project monitoring and status reporting.
- Work with internal IT teams to ensure delivery of the agreed solutions.
- Test new builds for all scenarios and production outcomes.

Key Decisions / Dimensions
- Decisions in solutions through technology and innovation
- Tackling production issues

Major Challenges
- Managing delivery and support as an added responsibility
- Managing support teams
- Fulfilling the entire requirement within restricted timelines

Required Qualifications And Experience
Qualifications
- Minimum qualification required is Graduation.
- Good negotiation and communication skills.

Work Experience
- Relevant work experience of 8 to 12 years.

Skills Keywords
Oracle SQL, PL/SQL, procedures, packages, cursors, triggers, functions, complex SQL queries and PL/SQL code, partitioning techniques, data loading mechanisms, indexes, and other knowledge and experience in database design. PostgreSQL 12.0, MSSQL 2019, Oracle 11g, Oracle 12c. Oracle SQL Developer or PL/SQL Developer. ADF 2.0. Knowledge of GitHub and DevOps using Azure Pipelines. Hands-on experience in query tuning and other optimization knowledge is an added advantage.
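The query-tuning and indexing skills this posting lists can be illustrated with a tiny sqlite3 session. The engine differs from Oracle or PostgreSQL, but the principle carries over: an index turns a full table scan into a keyed lookup, which is visible in the optimizer's query plan. Table and column names below are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the optimizer scans or uses an index;
    # the human-readable detail is the 4th column of each plan row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan: every row inspected
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # keyed lookup through the new index

print(before)
print(after)
conn.close()
```

On Oracle or PostgreSQL the same investigation runs through `EXPLAIN PLAN` / `EXPLAIN ANALYZE`, and the tuning decision (which columns to index, at what write-cost) is the same trade-off.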
Posted 3 weeks ago
10.0 - 12.0 years
25 - 27 Lacs
Indore, Hyderabad, Pune
Work from Office
We are seeking a skilled Lead Data Engineer with extensive experience in Snowflake, ADF, SQL, and other relevant data technologies to join our team. As a key member of our data engineering team, you will play an instrumental role in designing, developing, and managing data pipelines, working closely with cross-functional teams to drive the success of our data initiatives.

Key Responsibilities:
- Design, implement, and maintain data solutions using Snowflake, ADF, and SQL Server to ensure data integrity, scalability, and high performance.
- Lead and contribute to the development of data pipelines, ETL processes, and data integration solutions, ensuring the smooth extraction, transformation, and loading of data from diverse sources.
- Work with MSBI, SSIS, and Azure Data Lake Storage to optimize data flows and storage solutions.
- Collaborate with business and technical teams to identify project needs, estimate tasks, and set intermediate milestones to achieve final outcomes.
- Implement industry best practices related to Business Intelligence and Data Management, ensuring adherence to usability, design, and development standards.
- Perform in-depth data analysis to resolve data issues and improve overall data quality.
- Mentor and guide junior data engineers, providing technical expertise and supporting the development of their skills.
- Collaborate effectively with geographically distributed teams to ensure project goals are met in a timely manner.

Required Technical Skills:
- T-SQL, SQL Server, MSBI (SQL Server Integration Services, Reporting Services), Snowflake, Azure Data Factory (ADF), SSIS, Azure Data Lake Storage.
- Proficient in designing and developing data pipelines, data integration, and data management workflows.
- Strong understanding of Cloud Data Solutions, with a focus on Azure-based tools and technologies.

Nice to Have:
- Experience with Power BI for data visualization and reporting.
- Familiarity with Azure Databricks for data processing and advanced analytics.
Mandatory Key Skills: Azure Data Lake Storage, Business Intelligence, Data Management, T-SQL, Power BI, Azure Databricks, Cloud Data Solutions, Snowflake*, ADF*, SQL Server*, MSBI*, SSIS*
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer - Data Solutions Delivery + Data Catalog & Quality Engineer

About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy's employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary:
We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well-versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities:
- Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
- Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data integrity, quality, and security across all data platforms.
- Provide expertise in data engineering, data product, and data platform concepts.
- Implement data mesh principles and medallion architecture to build scalable data platforms.
- Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Implement data quality practices using tools like Great Expectations, Deequ, etc.
- Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
- Develop and maintain documentation for data solutions, data flows, and data models.
- Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field.
- Proven experience as a Data Engineer or similar role.
- In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
- Strong data warehousing skills, including ETL processes, data modelling, and reporting.
- Familiarity with manufacturing and supply chain domains.
- Proficiency in data engineering, data product, data platform concepts, data mesh, and medallion architecture.
- Experience in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with cloud-based data platforms and tools.
- Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce.
Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.

Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities.

We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
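The data-quality practices this posting attributes to tools like Great Expectations or Deequ reduce to declarative checks evaluated over a dataset, each reporting success and the offending rows. The sketch below is a minimal hand-rolled version of that idea — it is not the Great Expectations API, and the column names are invented.

```python
def expect_not_null(rows, column):
    """Fail if any row has a NULL/None in the given column."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"{column} not null",
            "success": not failures, "failed_rows": failures}

def expect_unique(rows, column):
    """Fail if any value in the column repeats; report repeat positions."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        else:
            seen.add(v)
    return {"check": f"{column} unique",
            "success": not dupes, "failed_rows": dupes}

data = [
    {"order_id": 1, "amount": 10},
    {"order_id": 2, "amount": None},   # violates not-null on amount
    {"order_id": 2, "amount": 5},      # violates uniqueness on order_id
]
results = [expect_not_null(data, "amount"), expect_unique(data, "order_id")]
for r in results:
    print(r)
```

Real tools add persistence of expectation suites, profiling, and pipeline integration, but a failing check halting (or quarantining) a load is the same mechanism either way.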
Posted 3 weeks ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us
At ANZ, we're applying new ways technology and data can be harnessed as we work towards a common goal: to improve the financial wellbeing and sustainability of our millions of customers.

About The Role
At ANZ our purpose is to shape a world where people and communities thrive. We're making this happen by improving our customers' financial wellbeing so they can achieve incredible things – be it buying their home, building a business or saving for things big or small.

Role Type: Permanent
Work Location: Bengaluru
Work Schedule: Regular Shifts (Hybrid/Blended)

Role Purpose:
To use analytical and communication skills to elicit, understand, validate and document requirements for business issues, functions and processes, from high-level business needs through to detailed solution requirements. The role plays a key part in supporting the delivery of solutions through the SDLC and in Agile delivery methodologies. The Business Analyst may work independently or in a scrum environment on small/low-complexity changes, and with guidance as part of a BA team on low-to-medium complexity components of larger initiatives.

What will your day look like?

Analysis:
- Elicitation – define and refine requirements
- Analysis and problem solving
- Process mapping – current & target state
- Identify dependencies, risks and issues
- Use of stories/acceptance criteria
- Building domain/application knowledge

Deliver:
- Self-managing – deliver to outcomes as agreed
- Good communication & documentation
- Responds to and invites feedback
- Facilitation skills – team based
- Learning ANZ release processes/ADF
- Learning Agile ways of working

Connect:
- Support team/colleagues | Develop relationships | Attend and learn in Guilds/CoPs

Lead:
- Provide and receive feedback | Share knowledge | Support colleagues' wellbeing

What will you bring?
Must have 8+ years of relevant experience in the banking domain.
- Establish and drive business and technical requirements gathering
- Work closely with developers, testers, project managers and business to define the technical requirements
- Create documentation, i.e. use cases, prototypes, technical specifications
- Facilitate workshops with stakeholders and internally
- Enjoy working in a cross-functional team, taking on a variety of analyst roles, and have the ability to flex between business and technical
- Analyse business and stakeholder requirements to define detailed functional and non-functional solution requirements at the level of detail and rigor sufficient to support development or make a solution decision.

Detailed Description
- Collaborate in Agile/Scrum environments, participating in ceremonies such as sprint planning, backlog grooming, and retrospectives.
- Translate business requirements into actionable insights and system capabilities, ensuring alignment with enterprise architecture and banking regulations.
- Demonstrate a strong understanding of banking products, services, and regulatory frameworks.
- Proactively identify gaps, dependencies, and risks, offering recommendations to optimize processes or technical solutions.
- Utilize data analysis tools (e.g., SQL, Excel, BI tools) to support data-driven decision-making and validate solution outcomes.
- Act as a bridge between business and IT, ensuring solutions meet user expectations and business goals.
- Ensure traceability of requirements through test case creation, validation, and UAT support.
- Engage in continuous improvement by identifying opportunities to streamline business processes and reduce manual effort.
- Possess strong stakeholder management and communication skills to influence and gain buy-in from senior business leaders.

So why join us?
ANZ is a place where big things happen as we work together to provide banking and financial services across more than 30 markets.
With more than 7,500 people, our Bengaluru team is the bank's largest technology, data and operations centre outside Australia. In operation for over 33 years, the centre is critical in delivering the bank's strategy and making an impact for our millions of customers around the world. Our Bengaluru team not only drives the transformation initiatives of the bank, it also drives a culture that makes ANZ a great place to be. We're proud that people feel they can be themselves at ANZ, and 90 percent of our people feel they belong.

We know our people need different things to be great in their role, so we offer a range of flexible working options, including hybrid work (where the role allows it). Our people also enjoy a range of benefits including access to health and wellbeing services.

We want to continue building a diverse workplace and welcome applications from everyone. Please talk to us about any adjustments you may require to our recruitment process or the role itself. If you are a candidate with a disability or access requirements, let us know how we can provide you with additional support.

To find out more about working at ANZ visit https://www.anz.com/careers/. You can apply for this role by visiting ANZ Careers and searching for reference number 97323.

Job Posting End Date: 03/06/2025, 11.59pm (Melbourne, Australia)
Posted 3 weeks ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description: Senior Data Engineer

Job Summary
We are seeking an experienced and highly motivated Senior Azure Data Engineer to join a Data & Analytics team. The ideal candidate will be a hands-on technical leader responsible for designing, developing, implementing, and managing scalable, robust, and secure data solutions on the Microsoft Azure platform. This role involves leading a team of data engineers, setting technical direction, ensuring the quality and efficiency of data pipelines, and collaborating closely with data scientists, analysts, and business stakeholders to meet data requirements.

Key Responsibilities
- Lead, mentor, and provide technical guidance to a team of Azure Data Engineers.
- Design, architect, and implement end-to-end data solutions on Azure, including data ingestion, transformation, storage (lakes/warehouses), and serving layers.
- Oversee and actively participate in the development, testing, and deployment of robust ETL/ELT pipelines using key Azure services.
- Establish and enforce data engineering best practices, coding standards, data quality checks, and monitoring frameworks.
- Ensure data solutions are optimized for performance, cost, scalability, security, and reliability.
- Collaborate effectively with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions.
- Manage, monitor, and troubleshoot Azure data platform components and pipelines.
- Contribute to the strategic technical roadmap for the data platform.

Qualifications & Experience
- Minimum 6-8+ years of overall experience in data engineering roles.
- Minimum 3-4+ years of hands-on experience designing, implementing, and managing data solutions specifically on the Microsoft Azure cloud platform.
- Proven experience (1-2+ years) in a lead or senior engineering role, demonstrating mentorship and technical guidance capabilities.
Education: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field (or equivalent practical experience).

Technical Skills
- Core Azure Data Services: Deep expertise in Azure Data Factory (ADF), Azure Synapse Analytics (SQL Pools, Spark Pools), Azure Databricks, Azure Data Lake Storage (ADLS Gen2).
- Data Processing & Programming: Strong proficiency with Spark (using PySpark or Scala) and expert-level SQL skills. Proficiency in Python is highly desired.
- Data Architecture & Modelling: Solid understanding of data warehousing principles (e.g., Kimball), dimensional modelling, ETL/ELT patterns, and data lake design.
- Databases: Experience with relational databases (e.g., Azure SQL Database); familiarity with NoSQL concepts/databases is beneficial.
- Version Control: Proficiency with Git for code management.
- Leadership & Soft Skills: Excellent leadership, mentoring, problem-solving, and communication skills, with the ability to collaborate effectively across various teams.

Skills
#   Azure Component           Proficiency
1   Azure Synapse Analytics   High
2   Azure Data Factory        High
3   Azure SQL                 High
4   ADLS Storage              High
5   Azure DevOps - CI/CD      High
6   Azure Databricks          Medium - High
7   Azure Logic App           Medium - High
8   Azure Fabric              Good to have, not mandatory
9   Azure Functions           Good to have, not mandatory
10  Azure Purview             Good to have, not mandatory

Additional notes:
- Good experience in data extraction patterns via ADF – API, files, databases.
- Data masking in Synapse, RBAC.
- Experience in data warehousing – Kimball modelling.
- Good communication and collaboration skills.
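The data-masking requirement above (listed here for Synapse) has the same shape on any engine: deterministically pseudonymize PII columns before data leaves a restricted zone, so joins still work but the original values are unrecoverable. A minimal plain-Python sketch; the column names and salt are hypothetical, and real deployments would manage the salt as a secret (e.g., in Key Vault) and use engine-native masking where available.

```python
import hashlib

def mask_value(value, salt="demo-salt"):
    """Deterministic pseudonymization: same input -> same token."""
    return hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]

def mask_columns(rows, pii_columns):
    """Return rows with only the PII columns replaced by tokens."""
    return [
        {k: (mask_value(v) if k in pii_columns else v) for k, v in row.items()}
        for row in rows
    ]

rows = [{"id": 1, "email": "user@example.com", "amount": 42}]
masked = mask_columns(rows, pii_columns={"email"})
print(masked[0]["email"])   # a 12-hex-char token, not the address
```

Because the mapping is deterministic, masked tables can still be joined on the tokenized key — the usual reason this approach is preferred over random redaction in analytics environments.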
Posted 3 weeks ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India with high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai
The estimated salary range for ADF professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here is a sample of interview questions for ADF roles, categorized by difficulty level:

Basic:
- What is ADF and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?

Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?

Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!