Jobs
Interviews

1640 ADF Jobs - Page 6

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

2 - 3 Lacs

Hyderābād

On-site

Country: India | Working Schedule: Full-Time | Work Arrangement: Hybrid | Relocation Assistance Available: No | Posted Date: 25-Jul-2025 | Job ID: 11122

Description and Requirements

Position Summary

The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission are to create innovative, transformative and contemporary technology solutions that empower our leaders and employees so they can focus on what matters most: our customers. We are technologists with strong business acumen, focused on developing our talent to continually transform and innovate.

We are seeking a highly motivated and skilled Azure Data Engineer to join our growing team in Hyderabad. This position is perfect for talented professionals with 4-8 years of experience in designing, building, and maintaining scalable cloud-based data solutions. As an Azure Data Engineer at MetLife, you will collaborate with cross-functional teams to enable data transformation, analytics, and decision-making by leveraging Microsoft Azure's advanced technologies. The ideal candidate is a strategic thinker, an effective communicator, and an expert in technological development.

Key Relationships

Internal Stakeholder –

Key Responsibilities

Design, develop, and maintain efficient and scalable data pipelines using Azure Data Factory (ADF) for ETL/ELT processes. Build and optimize data models and data flows in Azure Synapse Analytics, SQL Databases, and Azure Data Lake.
Work with large datasets to define, test, and implement data storage, transformation, and processing strategies using Azure-based services. Create and manage data pipelines for ingesting, processing, and transforming data from various sources into a structured format. Develop solutions for real-time and batch processing using tools like Azure Stream Analytics and Event Hubs. Implement data security, governance, and compliance measures to ensure the integrity and accessibility of the organization's data assets. Contribute to the migration of on-premises databases and ETL processes to the Azure cloud. Build processes to identify, monitor, and resolve data inconsistencies and quality issues. Collaborate with data architects, business analysts, and developers to deliver reliable and performant data solutions aligned with business requirements. Monitor and optimize performance and cost of Azure-based data solutions. Document architectures, data flows, pipelines, and implementations for future reference and knowledge sharing.

Knowledge, Skills, and Abilities

Education: A Bachelor's or Master's degree in Computer Science or an equivalent Engineering degree.

Candidate Qualifications: Education: Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience (Required): 4-8 years of experience in data engineering, with a strong focus on Azure-based services. Proficiency in Azure Data Factory (ADF), Azure Synapse Analytics, Azure Data Lake, and Azure SQL Databases. Strong knowledge of data modeling, ETL/ELT processes, and data pipeline design. Hands-on experience with Python, SQL, and Spark for data manipulation and transformation. Exposure to big data platforms like Hadoop, Databricks, or similar technologies. Experience with real-time data streaming using tools like Azure Stream Analytics, Event Hubs, or Service Bus. Familiarity with data governance, best practices, and security protocols within cloud environments.
Solid understanding of Azure DevOps for CI/CD pipelines around data workflows. Strong problem-solving skills with attention to detail and a results-driven mindset. Excellent collaboration, communication, and interpersonal skills for working with cross-functional teams.

Preferred: Demonstrated experience in end-to-end cloud data warehouse migrations. Familiarity with Power BI or other visualization tools for creating dashboards and reports. Certification as an Azure Data Engineer Associate or Azure Solutions Architect is a plus. Understanding of machine learning concepts and integrating AI/ML pipelines is an advantage.

Skills and Competencies

Language: Proficiency at business level in English.

Competencies: Communication: ability to influence, help communicate the organization's direction, and ensure results are achieved. Collaboration: proven track record of building collaborative partnerships and ability to operate effectively in a global environment. Diverse environment: can-do attitude and ability to work in a fast-paced environment.

Tech Stack: Development & Delivery Methods: Agile (Scaled Agile Framework). DevOps and CI/CD: Azure DevOps. Development Frameworks and Languages: SQL, Spark, Python. Azure: functional knowledge of cloud-based solutions.

About MetLife

Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future.
United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
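One responsibility in the listing above, building processes to identify and monitor data inconsistencies and quality issues, typically amounts to a post-ingestion validation step. The Python sketch below is purely illustrative; the function name, the sample records, and the two rules checked (missing required fields, duplicate business keys) are assumptions, not MetLife's actual tooling:

```python
def find_quality_issues(records, required_fields, unique_key):
    """Flag missing required fields and duplicate keys in ingested records."""
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        # Rule 1: required fields must be present and non-empty
        for field in required_fields:
            if rec.get(field) in (None, ""):
                issues.append((i, f"missing '{field}'"))
        # Rule 2: the business key must be unique across the batch
        key = rec.get(unique_key)
        if key in seen:
            issues.append((i, f"duplicate {unique_key}={key!r}"))
        seen.add(key)
    return issues

rows = [
    {"policy_id": "P1", "premium": 120.0},
    {"policy_id": "P1", "premium": ""},   # duplicate key, empty field
    {"policy_id": "P2"},                  # missing field
]
issues = find_quality_issues(rows, ["policy_id", "premium"], "policy_id")
```

In a real pipeline a check like this would run as an ADF or Databricks activity and route flagged rows to a quarantine table rather than a Python list.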

Posted 1 week ago

Apply

4.0 years

7 - 9 Lacs

Gurgaon

On-site

Why join Stryker? Looking for a place that values your unique talents? Discover Stryker's award-winning culture. We are excited to be named one of the World's Best Workplaces and a Best Workplace for Diversity by Fortune Magazine! Learn more about our award-winning organization by visiting stryker.com. We are proud to offer you our total rewards package, which includes bonuses, healthcare, insurance benefits, retirement programs, wellness programs, as well as service and performance awards, not to mention various social and recreational activities, all of which are location specific.

Position summary: The Service Delivery Analyst for Azure Production Support ensures the successful delivery of services and the attainment of agreed KPIs. This role demands proactivity, teamwork, and a commitment to continuous improvement. The position requires hands-on experience with Azure platforms, including Synapse, Azure Data Factory (ADF), and Databricks. The Service Delivery Analyst manages the execution and delivery of data and analytics services for Stryker. They oversee the analytics and reporting platforms and drive continuous improvements to support the global data analytics strategy and roadmap while collaborating with IT partners to ensure smooth support services.
Essential duties & responsibilities: Improve and maintain Azure-based data warehouse solutions. Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks. Manage relationships with IT vendors to ensure optimal service delivery and performance. Offer best practices, advice, and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions. Act as the liaison between technical teams and business stakeholders to ensure effective service delivery. Collaborate with cloud architects and engineers to optimize cost, performance, and security. Assist with onboarding new Azure services and integrating them into existing operations. Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions.

Operations: Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances and conduct long-term improvement operations to ensure compatibility with evolving mission requirements. Track and report on service metrics, including SLA compliance and incident resolution timelines. Ensure service documentation and knowledge base articles are kept up to date. Continuously improve the customer experience by maturing operations through automation, self-service, and shift-left approaches. Identify service delivery risks and develop mitigation strategies. Identify opportunities to improve system process flow, performance, and technical efficiencies. Drive adherence to global governance/release management processes to ensure system integrity.

Service Excellence: Ensure the roadmap for the service is defined and communicated with stakeholders, using their inputs and demand as drivers.
Ensure monitoring of services and related aspects of the service offering (e.g., support procedures) is in place to proactively identify and remediate issues and achieve delivery of service excellence. Ensure proactive root cause analysis and corrective and preventive measures are implemented to deliver an exceptional service. Monitor vendor performance against agreed expectations (contractual or otherwise), including regular vendor meetings to review expectations and performance.

Technical Skills: Azure Platform (Synapse, ADF, Databricks, Power BI).

Education & special trainings: Bachelor's degree required; Master's degree in Computer Science or Business Administration preferred. ITIL certification preferred.

Qualifications & experience: 4+ years of experience in cloud service delivery or IT operations, with a focus on Microsoft Azure. Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305). Strong understanding of Azure services including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager. Experience in service delivery operations, preferably in an ITIL-based Service Management role such as Service Level Management. Ability to develop good working relationships with technical, business, and sales teams using strong communication and team-building skills. Experience working within a managed service model. Ability to analyze numbers, trends, and data and draw new conclusions based on findings. Experience working with business leaders. Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control.

Stryker is a global leader in medical technologies and, together with its customers, is driven to make healthcare better. The company offers innovative products and services in MedSurg, Neurotechnology, Orthopaedics and Spine that help improve patient and healthcare outcomes.
Alongside its customers around the world, Stryker impacts more than 150 million patients annually.
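The SLA-compliance tracking named in the responsibilities above reduces to comparing each incident's resolution time against an agreed window. A minimal Python sketch; the 4-hour SLA and the incident timestamps are invented for illustration, not Stryker's actual targets:

```python
from datetime import datetime, timedelta

def sla_compliance_rate(incidents, sla=timedelta(hours=4)):
    """Return the fraction of (opened, resolved) incident pairs closed within the SLA."""
    met = sum(1 for opened, resolved in incidents if resolved - opened <= sla)
    return met / len(incidents)

incidents = [
    (datetime(2025, 7, 1, 9, 0), datetime(2025, 7, 1, 10, 30)),  # 1.5 h, within SLA
    (datetime(2025, 7, 1, 9, 0), datetime(2025, 7, 1, 14, 0)),   # 5 h, breached
    (datetime(2025, 7, 2, 8, 0), datetime(2025, 7, 2, 11, 59)),  # just within SLA
]
rate = sla_compliance_rate(incidents)
```

In practice these timestamps would come from an ITSM export (e.g., ServiceNow) and the rate would be reported per priority tier, but the calculation is the same.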

Posted 1 week ago

Apply

0 years

1 - 9 Lacs

Chennai

On-site

About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role: Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its Conference reporting operations.

What you will do: Collaborate with business stakeholders to design and build advanced analytics solutions for the Gartner Conference Technology business. Execute our data strategy through the design and development of data platforms that deliver Reporting, BI, and Advanced Analytics solutions. Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL, and ADF on the Azure platform. Consistently improve and optimize T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions using Azure Data Factory. Analyse and solve complex business problems, breaking the work down into actionable tasks. Develop, maintain, and document the data dictionary and data flow diagrams. Build and enhance the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment. Build optimized solutions and designs to handle big data. Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.
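A regression suite that monitors nightly ETL jobs, as described above, often starts with simple row-count drift checks between runs. A hedged Python sketch; the table names, counts, and 10% tolerance are hypothetical, not Gartner's actual thresholds:

```python
def detect_row_count_anomalies(previous, current, tolerance=0.10):
    """Flag tables whose nightly row count moved by more than `tolerance` (a fraction)."""
    anomalies = {}
    for table, prev in previous.items():
        if prev == 0:
            continue  # cannot compute a relative change from zero
        cur = current.get(table, 0)
        if abs(cur - prev) / prev > tolerance:
            anomalies[table] = (prev, cur)
    return anomalies

previous = {"fact_registrations": 10_000, "dim_conference": 50}
current = {"fact_registrations": 10_050, "dim_conference": 20}
flagged = detect_row_count_anomalies(previous, current)
```

A real suite would pull the counts from the warehouse (e.g., via a scheduled query after the nightly load) and raise an alert or fail the pipeline when `flagged` is non-empty.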
What you will need: A strong IT professional with in-depth knowledge of designing and developing end-to-end BI & Analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability.

Must have: Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance tuning, troubleshooting, and query optimization. Strong technical experience with Azure Data Factory on the Azure platform: create and manage complex ETL pipelines to extract, transform, and load data from various sources; monitor and troubleshoot data pipeline issues to ensure data integrity and availability; enhance data workflows to improve performance, scalability, and cost-effectiveness; establish best practices for data governance and security within data pipelines. Experience with cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, and data warehousing concepts, including Data Lake. Ensure thorough documentation of data processes, configurations, and operational procedures.
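The "hierarchical queries" requirement above usually means recursive common table expressions. The sketch below uses Python's built-in sqlite3 purely so it is self-contained; the org table and its rows are invented. Note that T-SQL on SQL Server / Azure SQL uses the same anchor-plus-UNION-ALL shape but omits the RECURSIVE keyword (`WITH chain AS (...)`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE org (id INTEGER PRIMARY KEY, name TEXT, parent_id INTEGER);
INSERT INTO org VALUES (1, 'CEO', NULL), (2, 'VP Eng', 1),
                       (3, 'Engineer', 2), (4, 'VP Sales', 1);
""")

# Walk the reporting hierarchy top-down, tracking each row's depth.
rows = conn.execute("""
WITH RECURSIVE chain(id, name, depth) AS (
    SELECT id, name, 0 FROM org WHERE parent_id IS NULL
    UNION ALL
    SELECT o.id, o.name, c.depth + 1
    FROM org o JOIN chain c ON o.parent_id = c.id
)
SELECT name, depth FROM chain ORDER BY depth, id;
""").fetchall()
```

The anchor member selects the root (no parent); the recursive member repeatedly joins children onto the rows found so far, which is the standard way to flatten parent-child tables.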
Who you are: A graduate/post-graduate in BE/BTech, ME/MTech, or MCA is preferred. An IT professional with 5-7 years of experience in data analytics, cloud technologies, and ETL development. Excellent communication and prioritization skills. Able to work independently or within a team, proactively, in a fast-paced Agile/Scrum environment. Strong desire to improve their skills in software development, frameworks, and technologies.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this or other roles. #LI-PM2

Who are we? At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations.
We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive, working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 101325

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence.
Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Azure Kubernetes Architect and Administrator (L3 Capacity, Managed Services)

Key Responsibilities: Azure Kubernetes Service (AKS): architect, manage, and optimize Kubernetes clusters on Azure, ensuring scalability, security, and high availability. Azure Infrastructure and Platform Services: IaaS: design and implement robust Azure-based infrastructure for critical BFSI applications. PaaS: optimize the use of Azure PaaS services, including App Services, Azure SQL Database, and Service Fabric. Security & Compliance: ensure adherence to BFSI industry standards by implementing advanced security measures (e.g., Azure Security Center, role-based access control, encryption protocols). Cost Optimization: analyze and optimize Azure resource usage to minimize costs while maintaining performance and compliance standards. Automation: develop CI/CD pipelines and automate workflows using tools like Terraform, Helm, and Azure DevOps. Process Improvements: continuously identify areas for operational enhancement in line with BFSI-specific needs. Collaboration: partner with cross-functional teams to support deployment, monitoring, troubleshooting, and the lifecycle management of applications.

Required Skills: Expertise in Azure Kubernetes Service (AKS), Azure IaaS and PaaS, and container orchestration. Strong knowledge of cloud security principles and tools such as Azure Security Center and Azure Key Vault. Proficiency in scripting languages like Python, Bash, or PowerShell. Familiarity with cost management tools such as Azure Cost Management + Billing. Experience in monitoring with Prometheus and Grafana. Understanding of BFSI compliance regulations and standards. Process improvement experience using frameworks like Lean, Six Sigma, or similar methodologies.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field.
Certifications such as Azure Solutions Architect, Certified Kubernetes Administrator (CKA), or Certified Azure DevOps Engineer are advantageous. Minimum 5 years of hands-on experience in Azure and Kubernetes environments within BFSI or similar industries. Expertise in AKS, Azure IaaS, PaaS, and security tools like Azure Security Center. Proficiency in scripting (Python, Bash, PowerShell). Strong understanding of BFSI compliance standards. Experience with monitoring tools such as Prometheus, Grafana, New Relic, Azure Log Analytics, and ADF. Skilled in cost management using Azure Cost Management tools. Knowledge of ServiceNow ITSM, Freshworks ITSM, change management, team leadership, and process improvement frameworks like Lean or Six Sigma.
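Cost-optimization work like that described above often begins with a cost export aggregated per resource group. A stdlib-only Python sketch; the CSV columns and figures are made up for illustration (real Azure Cost Management exports carry a richer schema with many more columns):

```python
import csv
import io
from collections import defaultdict

# Hypothetical miniature cost export; a real export would be read from a file or blob.
SAMPLE_EXPORT = """resource_group,service,cost_usd
rg-aks-prod,AKS,120.50
rg-aks-prod,Storage,10.00
rg-data,Azure SQL,80.25
"""

def cost_by_resource_group(csv_text):
    """Sum spend per resource group from a cost-export CSV."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["resource_group"]] += float(row["cost_usd"])
    return dict(totals)

totals = cost_by_resource_group(SAMPLE_EXPORT)
```

Grouping spend this way makes outlier resource groups obvious, which is the usual starting point before drilling into individual services or rightsizing clusters.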

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Why join Stryker? We are proud to be named one of the World's Best Workplaces and a Best Workplace for Diversity by Fortune Magazine! Learn more about our award-winning organization by visiting stryker.com. Our total rewards package includes bonuses, healthcare, insurance benefits, retirement programs, wellness programs, as well as service and performance awards, not to mention various social and recreational activities, all of which are location specific.

Position summary: The Service Delivery Analyst for Azure Production Support ensures the successful delivery of services and the attainment of agreed KPIs. This role demands proactivity, teamwork, and a commitment to continuous improvement. The position requires hands-on experience with Azure platforms, including Synapse, Azure Data Factory (ADF), and Databricks. The Service Delivery Analyst manages the execution and delivery of data and analytics services for Stryker. They oversee the analytics and reporting platforms and drive continuous improvements to support the global data analytics strategy and roadmap while collaborating with IT partners to ensure smooth support services.

Essential duties & responsibilities: Improve and maintain Azure-based data warehouse solutions. Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks. Manage relationships with IT vendors to ensure optimal service delivery and performance. Offer best practices, advice, and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions. Act as the liaison between technical teams and business stakeholders to ensure effective service delivery. Collaborate with cloud architects and engineers to optimize cost, performance, and security. Assist with onboarding new Azure services and integrating them into existing operations.
Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions.

Operations: Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances and conduct long-term improvement operations to ensure compatibility with evolving mission requirements. Track and report on service metrics, including SLA compliance and incident resolution timelines. Ensure service documentation and knowledge base articles are kept up to date. Continuously improve the customer experience by maturing operations through automation, self-service, and shift-left approaches. Identify service delivery risks and develop mitigation strategies. Identify opportunities to improve system process flow, performance, and technical efficiencies. Drive adherence to global governance/release management processes to ensure system integrity.

Service Excellence: Ensure the roadmap for the service is defined and communicated with stakeholders, using their inputs and demand as drivers. Ensure monitoring of services and related aspects of the service offering (e.g., support procedures) is in place to proactively identify and remediate issues and achieve delivery of service excellence. Ensure proactive root cause analysis and corrective and preventive measures are implemented to deliver an exceptional service. Monitor vendor performance against agreed expectations (contractual or otherwise), including regular vendor meetings to review expectations and performance.

Technical Skills: Azure Platform (Synapse, ADF, Databricks, Power BI).

Education & special trainings: Bachelor's degree required; Master's degree in Computer Science or Business Administration preferred. ITIL certification preferred.

Qualifications & experience: 4+ years of experience in cloud service delivery or IT operations, with a focus on Microsoft Azure.
Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305). Strong understanding of Azure services including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager. Experience in service delivery operations, preferably in an ITIL-based Service Management role such as Service Level Management. Ability to develop good working relationships with technical, business, and sales teams using strong communication and team-building skills. Experience working within a managed service model. Ability to analyze numbers, trends, and data and draw new conclusions based on findings. Experience working with business leaders. Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control.

Stryker is a global leader in medical technologies and, together with its customers, is driven to make healthcare better. The company offers innovative products and services in MedSurg, Neurotechnology, Orthopaedics and Spine that help improve patient and healthcare outcomes. Alongside its customers around the world, Stryker impacts more than 150 million patients annually.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role: Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its Conference reporting operations.

What you will do: Collaborate with business stakeholders to design and build advanced analytics solutions for the Gartner Conference Technology business. Execute our data strategy through the design and development of data platforms that deliver Reporting, BI, and Advanced Analytics solutions. Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL, and ADF on the Azure platform. Consistently improve and optimize T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions using Azure Data Factory. Analyse and solve complex business problems, breaking the work down into actionable tasks. Develop, maintain, and document the data dictionary and data flow diagrams. Build and enhance the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment. Build optimized solutions and designs to handle big data. Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.
What you will need: A strong IT professional with in-depth knowledge of designing and developing end-to-end BI & Analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability.

Must have: Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance tuning, troubleshooting, and query optimization. Strong technical experience with Azure Data Factory on the Azure platform: create and manage complex ETL pipelines to extract, transform, and load data from various sources; monitor and troubleshoot data pipeline issues to ensure data integrity and availability; enhance data workflows to improve performance, scalability, and cost-effectiveness; establish best practices for data governance and security within data pipelines. Experience with cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, and data warehousing concepts, including Data Lake. Ensure thorough documentation of data processes, configurations, and operational procedures.
Who you are: A graduate/post-graduate in BE/BTech, ME/MTech, or MCA is preferred. An IT professional with 5-7 years of experience in data analytics, cloud technologies, and ETL development. Excellent communication and prioritization skills. Able to work independently or within a team, proactively, in a fast-paced Agile/Scrum environment. Strong desire to improve their skills in software development, frameworks, and technologies.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this or other roles.

Who are we? At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results.
This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID:101325 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 1 week ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role:
Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its conference reporting operations.

What you will do:
- Collaborate with business stakeholders to design and build advanced analytic solutions for the Gartner Conference Technology business.
- Execute our data strategy through the design and development of data platforms that deliver reporting, BI and advanced analytics solutions.
- Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL and ADF on the Azure platform.
- Consistently improve and optimize T-SQL performance across the entire analytics platform.
- Create, build, and implement comprehensive data integration solutions using Azure Data Factory.
- Analyse and solve complex business problems, breaking the work down into actionable tasks.
- Develop, maintain, and document the data dictionary and data flow diagrams.
- Build and enhance the regression test suite that monitors nightly ETL jobs and identifies data issues.
- Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment.
- Build optimized solutions and designs to handle big data.
- Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers.
- Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.

What you will need:
A strong IT professional with deep knowledge of designing and developing end-to-end BI and analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability.

Must have:
- Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance.
- Ability to create and modify database objects such as stored procedures, views, tables, triggers, and indexes in Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance.
- Deep understanding of advanced SQL, including analytic (window) functions.
- Strong technical experience with database performance tuning, troubleshooting, and query optimization.
- Strong technical experience with Azure Data Factory on the Azure platform: creating and managing complex ETL pipelines that extract, transform, and load data from various sources; monitoring and troubleshooting pipeline issues to ensure data integrity and availability; enhancing workflows to improve performance, scalability, and cost-effectiveness; and establishing best practices for data governance and security within data pipelines.
- Experience with cloud platforms and Azure technologies such as Azure Analysis Services, Azure Blob Storage, Azure Data Lake, and Azure Delta Lake.
- Experience with data modelling, database design, data warehousing concepts, and data lakes.
- Thorough documentation of data processes, configurations, and operational procedures.

Who you are:
- Graduate/post-graduate degree in BE/BTech, ME/MTech or MCA preferred
- IT professional with 5-7 years of experience in data analytics, cloud technologies and ETL development
- Excellent communication and prioritization skills
- Able to work independently or within a team, proactively, in a fast-paced Agile-Scrum environment
- Strong desire to keep improving their skills in software development, frameworks, and technologies

Job Requisition ID: 101323
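The "analytic (window) functions" requirement above refers to T-SQL constructs such as ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...), commonly used to keep the latest row per key when reconciling ETL loads. As a rough, hypothetical illustration of that pattern (not Gartner's actual code), the same latest-row-per-partition logic can be sketched in plain Python:

```python
from itertools import groupby
from operator import itemgetter

def latest_per_key(rows, key, order_by):
    """Mimic T-SQL:
        ROW_NUMBER() OVER (PARTITION BY <key> ORDER BY <order_by> DESC)
    and keep only the row numbered 1 in each partition."""
    # Descending sort keeps equal partition keys adjacent, as groupby requires.
    ordered = sorted(rows, key=lambda r: (r[key], r[order_by]), reverse=True)
    # The first row of each group is the latest row for that key.
    return [next(grp) for _, grp in groupby(ordered, key=itemgetter(key))]

# Hypothetical nightly-load log: keep the most recent run per job.
loads = [
    {"job": "etl_a", "run": 1, "status": "ok"},
    {"job": "etl_a", "run": 2, "status": "failed"},
    {"job": "etl_b", "run": 1, "status": "ok"},
]
latest = latest_per_key(loads, "job", "run")
```

In T-SQL the equivalent query would filter on the row number in an outer query or CTE; the Python version is only meant to show the partition-then-rank shape of the operation.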

Posted 1 week ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role:
Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its conference reporting operations.

What you will do:
- Collaborate with business stakeholders to design and build advanced analytic solutions for the Gartner Conference Technology business.
- Execute our data strategy through the design and development of data platforms that deliver reporting, BI and advanced analytics solutions.
- Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL and ADF on the Azure platform.
- Consistently improve and optimize T-SQL performance across the entire analytics platform.
- Create, build, and implement comprehensive data integration solutions using Azure Data Factory.
- Analyse and solve complex business problems, breaking the work down into actionable tasks.
- Develop, maintain, and document the data dictionary and data flow diagrams.
- Build and enhance the regression test suite that monitors nightly ETL jobs and identifies data issues.
- Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment.
- Build optimized solutions and designs to handle big data.
- Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers.
- Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.

What you will need:
A strong IT professional with deep knowledge of designing and developing end-to-end BI and analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability.

Must have:
- Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance.
- Ability to create and modify database objects such as stored procedures, views, tables, triggers, and indexes in Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance.
- Deep understanding of advanced SQL, including analytic (window) functions.
- Strong technical experience with database performance tuning, troubleshooting, and query optimization.
- Strong technical experience with Azure Data Factory on the Azure platform: creating and managing complex ETL pipelines that extract, transform, and load data from various sources; monitoring and troubleshooting pipeline issues to ensure data integrity and availability; enhancing workflows to improve performance, scalability, and cost-effectiveness; and establishing best practices for data governance and security within data pipelines.
- Experience with cloud platforms and Azure technologies such as Azure Analysis Services, Azure Blob Storage, Azure Data Lake, and Azure Delta Lake.
- Experience with data modelling, database design, data warehousing concepts, and data lakes.
- Thorough documentation of data processes, configurations, and operational procedures.

Who you are:
- Graduate/post-graduate degree in BE/BTech, ME/MTech or MCA preferred
- IT professional with 5-7 years of experience in data analytics, cloud technologies and ETL development
- Excellent communication and prioritization skills
- Able to work independently or within a team, proactively, in a fast-paced Agile-Scrum environment
- Strong desire to keep improving their skills in software development, frameworks, and technologies

Job Requisition ID: 101324

Posted 1 week ago


4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

- Experience designing Azure iPaaS solutions with a focus on re-usability and loosely coupled architecture.
- Capable of leading a larger team and ensuring delivery with zero issues.
- Strong technical expertise in Azure iPaaS components, including Logic Apps, Function Apps, APIM, Service Bus, Event Hubs, Event Grid, ADF, and Key Vault.
- Proficient in creating and maintaining automated build and release pipelines (DevOps CI/CD pipelines).
- Hands-on experience designing and developing microservices-based architecture.
- Sound knowledge of Azure IaaS, PaaS, and SaaS.
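The "loosely coupled architecture" this posting asks for is what Azure messaging components such as Service Bus and Event Grid provide: publishers emit events without knowing which consumers exist. As a minimal, hypothetical in-memory sketch of that publish/subscribe shape (standard library only, not the Azure SDK):

```python
from collections import defaultdict

class Topic:
    """In-memory stand-in for a Service Bus topic / Event Grid subscription.

    Publishers only know the event type; consumers register themselves,
    so the two sides never reference each other directly.
    """
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Deliver to every registered handler for this event type.
        for handler in self._handlers[event_type]:
            handler(payload)

bus = Topic()
received = []
bus.subscribe("order.created", received.append)   # e.g. a billing Function App
bus.subscribe("order.created", lambda p: None)    # e.g. an audit Logic App
bus.publish("order.created", {"id": 42})
```

In a real Azure iPaaS design the broker also buffers messages and retries failed deliveries; this sketch only shows the decoupling of producer from consumers.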

Posted 1 week ago


5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Focused on relationships, you are building meaningful client connections and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Below are examples of role/skills profiles used by the UK firm when hiring for the Data Analytics roles indicated above.

Job Description & Summary
Operate is the firm's delivery engine, serving as the orchestrator of services across the organisation. It is a global team of delivery professionals united by a commitment to excellence and impact. Operate has built a strong reputation for collaboration, mobilising quickly, and effectively getting tasks done. It aims to build a world-class delivery capability, focusing on evolving operational delivery, embedding automation and AI, and raising the bar for quality and consistency. The goal is to add strategic value for clients and contribute to the firm's ambition of pre-eminence in the market. Team members in Operate are provided with meaningful opportunities to lead, learn, and grow, embracing a future-ready workforce trained in cutting-edge technology. Operate ensures clients can access a single front door to global delivery chains, providing tailored, high-quality solutions to meet evolving challenges.

The role will be based in Kolkata. However, with a diverse range of clients and projects, you'll occasionally have the exciting opportunity to work in various locations, offering exposure to different industries and cultures. This flexibility opens doors to unique networking experiences and accelerated career growth, enriching your professional journey. Your willingness and ability to do this will be discussed as part of the recruitment process. Candidates who prefer not to travel will still be considered.

Role Description
As a pivotal member of our data team, Senior Associates are key in shaping and refining data management and analytics functions, including our expanding Data Services. You will be instrumental in helping us deliver value-driven insights by designing, integrating, and analysing cutting-edge data systems. The role emphasises leveraging the latest technologies, particularly within the Microsoft ecosystem, to enhance operational capabilities and drive innovation. You'll work on diverse and challenging projects, allowing you to actively influence strategic decisions and develop innovative solutions. This, in turn, paves the way for unparalleled professional growth and the development of a forward-thinking mindset. As you contribute to our Data Services, you'll have a front-row seat to the future of data analytics, providing an enriching environment to build expertise and expand your career horizons.

Key activities include, but are not limited to:
- Design and implement data integration processes.
- Manage data projects with multiple stakeholders and tight timelines.
- Develop data models and frameworks that enhance data governance and efficiency.
- Address challenges related to data integration, quality, and management processes.
- Implement best practices in automation to streamline data workflows.
- Engage with key stakeholders to extract, interpret, and translate data requirements into meaningful insights and solutions.
- Engage with clients to understand and deliver data solutions.
- Work collaboratively to meet project goals.
- Lead and mentor junior team members.

Essential Requirements
- More than 5 years of experience in data analytics, with proficiency in managing large datasets and crafting detailed reports.
- Proficiency in Python.
- Experience working within a Microsoft Azure environment.
- Experience with data warehousing and data modelling (e.g., dimensional modelling, data mesh, data fabric).
- Proficiency in PySpark/Databricks/Snowflake/MS Fabric, and intermediate SQL skills.
- Experience with orchestration tools such as Azure Data Factory (ADF), Airflow, or dbt.
- Familiarity with DevOps practices, specifically creating CI/CD and release pipelines.
- Knowledge of Azure DevOps tools and GitHub.
- Knowledge of Azure SQL DB or another RDBMS.
- Basic knowledge of GenAI.

Additional Skills/Experiences That Will Be Beneficial
- Understanding of data governance frameworks.
- Awareness of Power Automate functionalities.

Why Join Us?
This role isn't just about technical expertise; it's about being part of something transformational. You'll be part of a vibrant team where growth opportunities are vast and where your contributions directly impact our mission to break new ground in data services. With a work culture that values innovation, collaboration, and personal growth, joining PwC's Operate Data Analytics team offers you the chance to shape the future of operational and data service solutions with creativity and foresight. Dive into exciting projects, challenge the status quo, and drive the narrative forward!
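The orchestration tools listed above (ADF, Airflow, dbt) are commonly used to drive incremental loads with a high-watermark pattern: each run pulls only rows modified since the previous run, then advances the stored watermark. A minimal sketch of that pattern in plain Python, with hypothetical field names (`modified`, `id`), not tied to any specific tool's API:

```python
from datetime import datetime

def incremental_extract(source_rows, last_watermark):
    """Watermark pattern behind ADF incremental copy pipelines:
    select rows changed since the previous run, then advance the watermark."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    # If nothing changed, the watermark stays where it was.
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

# Hypothetical source table snapshot.
rows = [
    {"id": 1, "modified": datetime(2025, 7, 1)},
    {"id": 2, "modified": datetime(2025, 7, 20)},
]
batch, wm = incremental_extract(rows, datetime(2025, 7, 10))
```

In a real pipeline the watermark would be persisted (e.g. in a control table) between runs and the extract pushed down to the source query; the sketch only shows the bookkeeping.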

Posted 1 week ago


0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Below are examples of role/skills profiles used by the UK firm when hiring for the Data Analytics roles indicated above.

Job Description & Summary
Operate is the firm's delivery engine, serving as the orchestrator of services across the organisation. It is a global team of delivery professionals united by a commitment to excellence and impact. Operate has built a strong reputation for collaboration, mobilising quickly, and effectively getting tasks done. It aims to build a world-class delivery capability, focusing on evolving operational delivery, embedding automation and AI, and raising the bar for quality and consistency. The goal is to add strategic value for clients and contribute to the firm's ambition of pre-eminence in the market. Team members in Operate are provided with meaningful opportunities to lead, learn, and grow, embracing a future-ready workforce trained in cutting-edge technology. Operate ensures clients can access a single front door to global delivery chains, providing tailored, high-quality solutions to meet evolving challenges.

The role will be based in Kolkata. However, with a diverse range of clients and projects, you'll occasionally have the exciting opportunity to work in various locations, offering exposure to different industries and cultures. This flexibility opens doors to unique networking experiences and accelerated career growth, enriching your professional journey. Your willingness and ability to do this will be discussed as part of the recruitment process. Candidates who prefer not to travel will still be considered.

Role Description
As an integral part of our data team, Associate 2 professionals contribute significantly to the development of data management and analytics functions, including our growing Data Services. In this role, you'll assist engagement teams in delivering meaningful insights by helping design, integrate, and analyse data systems. You will work with the latest technologies, especially within the Microsoft ecosystem, to enhance our operational capabilities. Working on a variety of projects, you'll have the chance to contribute your ideas and support innovative solutions. This experience offers opportunities for professional growth and helps cultivate a forward-thinking mindset. As you support our Data Services, you'll gain exposure to the evolving field of data analytics, providing an excellent foundation for building expertise and expanding your career journey.

Key activities include, but are not limited to:
- Assisting in the development of data models and frameworks to enhance data governance and efficiency.
- Supporting efforts to address data integration, quality, and management process challenges.
- Participating in the implementation of best practices in automation to streamline data workflows.
- Collaborating with stakeholders to gather, interpret, and translate data requirements into practical insights and solutions.
- Supporting the management of data projects alongside senior team members.
- Assisting in engaging with clients to understand their data needs.
- Working effectively as part of a team to achieve project goals.

Essential Requirements
- At least two years of experience in data analytics, with a focus on handling large datasets and supporting the creation of detailed reports.
- Familiarity with Python and experience working within a Microsoft Azure environment.
- Exposure to data warehousing and data modelling techniques (e.g., dimensional modelling).
- Basic proficiency in PySpark and Databricks/Snowflake/MS Fabric, with foundational SQL skills.
- Experience with orchestration tools such as Azure Data Factory (ADF), Airflow, or dbt.
- Awareness of DevOps practices, including introducing CI/CD and release pipelines.
- Familiarity with Azure DevOps tools and GitHub.
- Basic understanding of Azure SQL DB or other RDBMS systems.
- Introductory knowledge of GenAI concepts.

Additional Skills/Experiences That Will Be Beneficial
- Understanding of data governance frameworks.
- Awareness of Power Automate functionalities.

Why Join Us?
This role isn't just about technical expertise; it's about being part of something transformational. You'll be part of a vibrant team where growth opportunities are vast and where your contributions directly impact our mission to break new ground in data services. With a work culture that values innovation, collaboration, and personal growth, joining PwC's Operate Data Analytics team offers you the chance to shape the future of operational and data service solutions with creativity and foresight. Dive into exciting projects, challenge the status quo, and drive the narrative forward!
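The "dimensional modelling" requirement above refers to star-schema designs where fact rows carry keys into conformed dimension tables. A toy sketch of that enrichment step in plain Python, with hypothetical table and column names:

```python
# Conformed dimension: surrogate key -> descriptive attributes.
dim_product = {
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "ebook", "category": "digital"},
}

# Fact table: foreign key plus measures.
fact_sales = [
    {"product_id": 1, "qty": 3},
    {"product_id": 2, "qty": 5},
]

def enrich(facts, dim):
    """Join each fact row to its dimension row, the core move of a
    star-schema query (fact JOIN dimension ON surrogate key)."""
    return [{**f, **dim[f["product_id"]]} for f in facts]

report = enrich(fact_sales, dim_product)
```

In a warehouse this join runs in SQL (or PySpark) against real tables; the sketch only shows why the dimension's surrogate key must appear on every fact row.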

Posted 1 week ago


0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. 
Below are examples of role/skills profiles used by the UK firm when hiring Data Analytics based roles indicated above. Job Description & Summary Operate is the firm's delivery engine, serving as the orchestrator of services across the organisation. It is a global team of delivery professionals united by a commitment to excellence and impact. Operate has built a strong reputation for collaboration, mobilising quickly, and effectively getting tasks done. It aims to build a world-class delivery capability, focusing on evolving operational delivery, embedding automation and AI, and raising the bar for quality and consistency. The goal is to add strategic value for clients and contribute to the firm’s ambition of pre-eminence in the market. Team members in Operate are provided with meaningful opportunities to lead, learn, and grow, embracing a future-ready workforce trained in cutting-edge technology. Operate ensures clients can access a single front door to global delivery chains, providing tailored, high-quality solutions to meet evolving challenges. The role will be based in Kolkata. However, with a diverse range of clients and projects, you'll occasionally have the exciting opportunity to work in various locations, offering exposure to different industries and cultures. This flexibility opens doors to unique networking experiences and accelerated career growth, enriching your professional journey. Your willingness and ability to do this will be discussed as part of the recruitment process. Candidates who prefer not to travel will still be considered. Role Description As an integral part of our data team, Associate 2 professionals contribute significantly to the development of data management and analytics functions, including our growing Data Services. In this role, you'll assist engagement teams in delivering meaningful insights by helping design, integrate, and analyse data systems. 
You will work with the latest technologies, especially within the Microsoft ecosystem, to enhance our operational capabilities. Working on a variety of projects, you'll have the chance to contribute your ideas and support innovative solutions. This experience offers opportunities for professional growth and helps cultivate a forward-thinking mindset. As you support our Data Services, you'll gain exposure to the evolving field of data analytics, providing an excellent foundation for building expertise and expanding your career journey. Key Activities Include, But Are Not Limited To: Assisting in the development of data models and frameworks to enhance data governance and efficiency. Supporting efforts to address data integration, quality, and management process challenges. Participating in the implementation of best practices in automation to streamline data workflows. Collaborating with stakeholders to gather, interpret, and translate data requirements into practical insights and solutions. Supporting management of data projects alongside senior team members. Assisting in engaging with clients to understand their data needs. Working effectively as part of a team to achieve project goals. Essential Requirements: At least two years of experience in data analytics, with a focus on handling large datasets and supporting the creation of detailed reports. Familiarity with Python and experience working within a Microsoft Azure environment. Exposure to data warehousing and data modelling techniques (e.g., dimensional modelling). Basic proficiency in PySpark and Databricks/Snowflake/MS Fabric, with foundational SQL skills. Experience with orchestration tools like Azure Data Factory (ADF), Airflow, or dbt. Awareness of DevOps practices, including CI/CD and release pipelines. Familiarity with Azure DevOps tools and GitHub. Basic understanding of Azure SQL DB or other RDBMS systems. Introductory knowledge of GenAI concepts.
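For readers new to the ETL/ELT work these requirements describe, here is a minimal, tool-agnostic sketch of the extract-transform-load pattern in plain Python. In practice an orchestrator such as ADF or Airflow schedules and chains steps like these; every name below is illustrative, not taken from any real pipeline.

```python
# Hypothetical ETL steps: extract raw rows, transform them, load to a sink.
def extract():
    # Stand-in for reading from a source system (file, API, database).
    return [{"id": 1, "amount": "120.50"}, {"id": 2, "amount": "80.00"}]

def transform(rows):
    # Cast string amounts to floats and add a derived validity flag.
    return [{**r, "amount": float(r["amount"]), "valid": float(r["amount"]) > 0}
            for r in rows]

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns rows written.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

An orchestrator adds scheduling, retries, and lineage on top of this same shape; the step boundaries are what make a pipeline testable and restartable.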
Additional Skills / Experiences That Will Be Beneficial Understanding of data governance frameworks. Awareness of Power Automate functionalities.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. 
Below are examples of role/skills profiles used by the UK firm when hiring Data Analytics based roles indicated above. Job Description & Summary Operate is the firm's delivery engine, serving as the orchestrator of services across the organisation. It is a global team of delivery professionals united by a commitment to excellence and impact. Operate has built a strong reputation for collaboration, mobilising quickly, and effectively getting tasks done. It aims to build a world-class delivery capability, focusing on evolving operational delivery, embedding automation and AI, and raising the bar for quality and consistency. The goal is to add strategic value for clients and contribute to the firm’s ambition of pre-eminence in the market. Team members in Operate are provided with meaningful opportunities to lead, learn, and grow, embracing a future-ready workforce trained in cutting-edge technology. Operate ensures clients can access a single front door to global delivery chains, providing tailored, high-quality solutions to meet evolving challenges. The role will be based in Kolkata. However, with a diverse range of clients and projects, you'll occasionally have the exciting opportunity to work in various locations, offering exposure to different industries and cultures. This flexibility opens doors to unique networking experiences and accelerated career growth, enriching your professional journey. Your willingness and ability to do this will be discussed as part of the recruitment process. Candidates who prefer not to travel will still be considered. Role Description As a pivotal member of our data team, Senior Associates are key in shaping and refining data management and analytics functions, including our expanding Data Services. You will be instrumental in helping us deliver value-driven insights by designing, integrating, and analysing cutting-edge data systems. 
The role emphasises leveraging the latest technologies, particularly within the Microsoft ecosystem, to enhance operational capabilities and drive innovation. You'll work on diverse and challenging projects, allowing you to actively influence strategic decisions and develop innovative solutions. This, in turn, paves the way for unparalleled professional growth and the development of a forward-thinking mindset. As you contribute to our Data Services, you'll have a front-row seat to the future of data analytics, providing an enriching environment to build expertise and expand your career horizons. Key Activities Include, But Are Not Limited To: Designing and implementing data integration processes. Managing data projects with multiple stakeholders and tight timelines. Developing data models and frameworks that enhance data governance and efficiency. Addressing challenges related to data integration, quality, and management processes. Implementing best practices in automation to streamline data workflows. Engaging with key stakeholders to extract, interpret, and translate data requirements into meaningful insights and solutions. Engaging with clients to understand and deliver data solutions. Working collaboratively to meet project goals. Leading and mentoring junior team members. Essential Requirements: More than 5 years of experience in data analytics, with proficiency in managing large datasets and crafting detailed reports. Proficiency in Python and experience working within a Microsoft Azure environment. Experience with data warehousing and data modelling (e.g., dimensional modelling, data mesh, data fabric). Proficiency in PySpark/Databricks/Snowflake/MS Fabric, and intermediate SQL skills. Experience with orchestration tools such as Azure Data Factory (ADF), Airflow, or dbt. Familiarity with DevOps practices, specifically creating CI/CD and release pipelines. Knowledge of Azure DevOps tools and GitHub. Knowledge of Azure SQL DB or any other RDBMS system. Basic knowledge of GenAI.
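The dimensional modelling this profile calls for typically centres on fact and dimension tables queried with joins and aggregations. A minimal star-schema sketch, using SQLite purely for illustration; the table and column names are hypothetical:

```python
import sqlite3

# A toy star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (customer_key INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# The typical dimensional query: aggregate facts by a dimension attribute.
rows = con.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
```

The same fact/dimension split underlies warehouses in Synapse, Databricks, or Snowflake; only the engine changes, not the modelling idea.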
Additional Skills / Experiences That Will Be Beneficial Understanding of data governance frameworks. Awareness of Power Automate functionalities. Why Join Us? This role isn't just about the technical expertise—it’s about being part of something transformational. You'll be part of a vibrant team where growth opportunities are vast and where your contributions directly impact our mission to break new ground in data services. With a work culture that values innovation, collaboration, and personal growth, joining PwC's Operate Data Analytics team offers you the chance to shape the future of operational and data service solutions with creativity and foresight. Dive into exciting projects, challenge the status quo, and drive the narrative forward!

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. 
Below are examples of role/skills profiles used by the UK firm when hiring Data Analytics based roles indicated above. Job Description & Summary Operate is the firm's delivery engine, serving as the orchestrator of services across the organisation. It is a global team of delivery professionals united by a commitment to excellence and impact. Operate has built a strong reputation for collaboration, mobilising quickly, and effectively getting tasks done. It aims to build a world-class delivery capability, focusing on evolving operational delivery, embedding automation and AI, and raising the bar for quality and consistency. The goal is to add strategic value for clients and contribute to the firm’s ambition of pre-eminence in the market. Team members in Operate are provided with meaningful opportunities to lead, learn, and grow, embracing a future-ready workforce trained in cutting-edge technology. Operate ensures clients can access a single front door to global delivery chains, providing tailored, high-quality solutions to meet evolving challenges. The role will be based in Kolkata. However, with a diverse range of clients and projects, you'll occasionally have the exciting opportunity to work in various locations, offering exposure to different industries and cultures. This flexibility opens doors to unique networking experiences and accelerated career growth, enriching your professional journey. Your willingness and ability to do this will be discussed as part of the recruitment process. Candidates who prefer not to travel will still be considered. Role Description As an integral part of our data team, Associate 2 professionals contribute significantly to the development of data management and analytics functions, including our growing Data Services. In this role, you'll assist engagement teams in delivering meaningful insights by helping design, integrate, and analyse data systems. 
You will work with the latest technologies, especially within the Microsoft ecosystem, to enhance our operational capabilities. Working on a variety of projects, you'll have the chance to contribute your ideas and support innovative solutions. This experience offers opportunities for professional growth and helps cultivate a forward-thinking mindset. As you support our Data Services, you'll gain exposure to the evolving field of data analytics, providing an excellent foundation for building expertise and expanding your career journey. Key Activities Include, But Are Not Limited To: Assisting in the development of data models and frameworks to enhance data governance and efficiency. Supporting efforts to address data integration, quality, and management process challenges. Participating in the implementation of best practices in automation to streamline data workflows. Collaborating with stakeholders to gather, interpret, and translate data requirements into practical insights and solutions. Supporting management of data projects alongside senior team members. Assisting in engaging with clients to understand their data needs. Working effectively as part of a team to achieve project goals. Essential Requirements: At least two years of experience in data analytics, with a focus on handling large datasets and supporting the creation of detailed reports. Familiarity with Python and experience working within a Microsoft Azure environment. Exposure to data warehousing and data modelling techniques (e.g., dimensional modelling). Basic proficiency in PySpark and Databricks/Snowflake/MS Fabric, with foundational SQL skills. Experience with orchestration tools like Azure Data Factory (ADF), Airflow, or dbt. Awareness of DevOps practices, including CI/CD and release pipelines. Familiarity with Azure DevOps tools and GitHub. Basic understanding of Azure SQL DB or other RDBMS systems. Introductory knowledge of GenAI concepts.
Additional Skills / Experiences That Will Be Beneficial Understanding of data governance frameworks. Awareness of Power Automate functionalities.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities: Partner with business, product, and engineering teams to define problem statements, evaluate feasibility, and design AI/ML-driven solutions that deliver measurable business value. Lead and execute end-to-end AI/ML projects, from data exploration and model development to validation, deployment, and monitoring in production. Drive solution architecture using techniques in data engineering, programming, machine learning, NLP, and Generative AI. Champion the scalability, reproducibility, and sustainability of AI solutions by establishing best practices in model development, CI/CD, and performance tracking. Guide junior and associate AI/ML engineers through technical mentoring, code reviews, and solution reviews. Identify and evangelize the adoption of emerging tools, technologies, and methodologies across teams. Translate technical outputs into actionable insights for business stakeholders through storytelling, data visualizations, and stakeholder engagement. We are looking for: A seasoned AI/ML engineer with 7+ years of hands-on experience delivering enterprise-grade AI/ML solutions. Advanced proficiency in Python, SQL, and PySpark, plus experience working with cloud platforms (Azure preferred) and tools such as Databricks, Synapse, ADF, and Web Apps. Strong expertise in applied text analytics, NLP, and Generative AI, with real-world deployment exposure. Solid understanding of model evaluation, optimization, bias mitigation, and monitoring in production. A problem solver with scientific rigor, strong business acumen, and the ability to bridge the gap between data and decisions. Prior experience in leading cross-functional AI initiatives or collaborating with engineering teams to deploy ML pipelines. A Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related quantitative field; a PhD is a plus. Prior understanding of the shipping and logistics business domain is an advantage.
Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key responsibilities: Partner with business, product, and engineering teams to define problem statements, evaluate feasibility, and design AI/ML-driven solutions that deliver measurable business value. Lead and execute end-to-end AI/ML projects, from data exploration and model development to validation, deployment, and monitoring in production. Drive solution architecture using advanced techniques in machine learning, NLP, Generative AI, and statistical modeling. Champion the scalability, reproducibility, and sustainability of AI solutions by establishing best practices in model development, CI/CD, and performance tracking. Guide junior and associate AI/ML scientists through technical mentoring, code reviews, and solution reviews. Identify and evangelize the adoption of emerging tools, technologies, and methodologies across teams. Translate technical outputs into actionable insights for business stakeholders through storytelling, data visualizations, and stakeholder engagement. We are looking for: A seasoned AI/ML scientist with 7+ years of hands-on experience delivering enterprise-grade AI/ML solutions. Advanced proficiency in Python, SQL, and PySpark, plus experience working with cloud platforms (Azure preferred) and tools such as Databricks, Synapse, ADF, and Web Apps. Strong expertise in text analytics, NLP, and Generative AI, with real-world deployment exposure. Solid understanding of model evaluation, optimization, bias mitigation, and monitoring in production. A problem solver with scientific rigor, strong business acumen, and the ability to bridge the gap between data and decisions. Prior experience in leading cross-functional AI initiatives or collaborating with engineering teams to deploy ML pipelines. A Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related quantitative field; a PhD is a plus. Prior understanding of the shipping and logistics business domain is an advantage.
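The model-evaluation skills this role asks for include classification metrics such as precision and recall. A hand-rolled sketch for intuition; real projects would normally rely on a library such as scikit-learn:

```python
# Precision = of everything predicted positive, how much was right.
# Recall    = of everything actually positive, how much was found.
def precision_recall(y_true, y_pred):
    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))          # true positives
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))    # false positives
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))    # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy labels: the model misses one of three actual positives.
p, r = precision_recall([1, 0, 1, 1], [1, 0, 0, 1])
```

Monitoring in production amounts to recomputing such metrics (and their per-segment breakdowns, for bias checks) on fresh labelled data over time.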

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key responsibilities: Collaborate with business, platform, and technology stakeholders to understand the scope of projects. Perform comprehensive exploratory data analysis at various levels of granularity of the data to derive inferences for further solutioning, experimentation, and evaluation. Design, develop, and deploy robust enterprise AI solutions using Generative AI, NLP, machine learning, etc. Continuously focus on providing business value while ensuring technical sustainability. Promote and drive adoption of cutting-edge data science and AI practices within the team. Continuously stay up to date on relevant technologies and use this knowledge to push the team forward. We are looking for: A team player with 4-7 years of experience in the field of data science and AI. Proficiency with programming/querying languages like Python, SQL, and PySpark, along with Azure cloud platform tools like Databricks, ADF, Synapse, Web Apps, etc. An individual with strong work experience in the areas of text analytics, NLP, and Generative AI. A person with a scientific and analytical mindset, comfortable with brainstorming and ideation. A doer with a deep interest in driving business outcomes through AI/ML. A candidate with a Bachelor's or Master's degree in Engineering or Computer Science, with or without a specialization in AI/ML. A candidate with strong business acumen and a desire to collaborate with business teams and help them solve business problems. Prior understanding of the shipping and logistics business domain is an advantage. Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
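The text-analytics work this posting describes starts from representations as simple as a bag-of-words count. A toy sketch; production NLP would use dedicated libraries, and the sample sentence is purely illustrative:

```python
from collections import Counter

# Bag of words: map a document to token counts, ignoring word order.
def bag_of_words(doc):
    return Counter(doc.lower().split())

counts = bag_of_words("Shipping delays reported shipping resumed")
```

Counts like these feed directly into TF-IDF weighting and simple classifiers, which is often the baseline against which NLP and GenAI approaches are judged.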

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description and Requirements

Position Summary
The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission are to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate.

We are seeking a highly motivated and skilled Azure Data Engineer to join our growing team in Hyderabad. This position is perfect for talented professionals with 4-8 years of experience in designing, building, and maintaining scalable cloud-based data solutions. As an Azure Data Engineer at MetLife, you will collaborate with cross-functional teams to enable data transformation, analytics, and decision-making by leveraging Microsoft Azure’s advanced technologies. The ideal candidate is a strategic thinker, an effective communicator, and an expert in technological development.

Key Relationships
Internal Stakeholders –

Key Responsibilities
Design, develop, and maintain efficient and scalable data pipelines using Azure Data Factory (ADF) for ETL/ELT processes. Build and optimize data models and data flows in Azure Synapse Analytics, SQL Databases, and Azure Data Lake. Work with large datasets to define, test, and implement data storage, transformation, and processing strategies using Azure-based services.
Create and manage data pipelines for ingesting, processing, and transforming data from various sources into a structured format. Develop solutions for real-time and batch processing using tools like Azure Stream Analytics and Event Hubs. Implement data security, governance, and compliance measures to ensure the integrity and accessibility of the organization’s data assets. Contribute to the migration of on-premises databases and ETL processes to Azure cloud. Build processes to identify, monitor, and resolve data inconsistencies and quality issues. Collaborate with data architects, business analysts, and developers to deliver reliable and performant data solutions aligned with business requirements. Monitor and optimize performance and cost of Azure-based data solutions. Document architectures, data flows, pipelines, and implementations for future reference and knowledge sharing.

Knowledge, Skills, and Abilities

Education: A Bachelor's or Master's degree in Computer Science or an equivalent engineering degree.

Candidate Qualifications:
Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience:
Required: 4-8 years of experience in data engineering, with a strong focus on Azure-based services. Proficiency in Azure Data Factory (ADF), Azure Synapse Analytics, Azure Data Lake, and Azure SQL Databases. Strong knowledge of data modeling, ETL/ELT processes, and data pipeline design. Hands-on experience with Python, SQL, and Spark for data manipulation and transformation. Exposure to big data platforms like Hadoop, Databricks, or similar technologies. Experience with real-time data streaming using tools like Azure Stream Analytics, Event Hubs, or Service Bus. Familiarity with data governance, best practices, and security protocols within cloud environments. Solid understanding of Azure DevOps for CI/CD pipelines around data workflows. Strong problem-solving skills with attention to detail and a results-driven mindset.
Excellent collaboration, communication, and interpersonal skills for working with cross-functional teams. Preferred: Demonstrated experience in end-to-end cloud data warehouse migrations. Familiarity with Power BI or other visualization tools for creating dashboards and reports. Certification as an Azure Data Engineer Associate or Azure Solutions Architect is a plus. Understanding of machine learning concepts and integrating AI/ML pipelines is an advantage.

Skills and Competencies:
Language: Proficiency at business level in English.
Competencies: Communication: Ability to influence and help communicate the organization’s direction and ensure results are achieved. Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment. Diverse environment: Can-do attitude and ability to work in a fast-paced environment.

Tech Stack
Development & Delivery Methods: Agile (Scaled Agile Framework). DevOps and CI/CD: Azure DevOps. Development Frameworks and Languages: SQL, Spark, Python. Azure: Functional knowledge of cloud-based solutions.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
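The data-quality responsibility in this posting ("build processes to identify, monitor, and resolve data inconsistencies") can be sketched as a small rule-driven check run before data reaches downstream consumers. The rule and field names here are hypothetical:

```python
# Hypothetical validation rules: each maps a name to a per-row predicate.
RULES = {
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
    "id_present": lambda row: row.get("id") is not None,
}

def quality_report(rows):
    """Return {rule_name: [indices of rows failing that rule]}."""
    return {name: [i for i, row in enumerate(rows) if not rule(row)]
            for name, rule in RULES.items()}

report = quality_report([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},   # fails both rules
])
```

In an Azure pipeline the same idea typically runs as a validation activity or notebook step, with failing-row counts surfaced to monitoring before the load proceeds.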

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Summary Position Summary Technical Lead – Big Data & Python skillset As a Technical Lead, you will be a strong full-stack developer and individual contributor, responsible for designing application modules and delivering from the technical standpoint. Highly skilled in producing high-level designs in collaboration with the architect, and in leading module implementations technically. Must be a strong developer with the ability to innovate. Should be the go-to person on the assigned modules, applications/projects, and initiatives. Maintains appropriate certifications and applies respective skills on project engagements. Work you’ll do A unique opportunity to be a part of the growing Delivery, Methods & Tools team that drives consistency, quality, and efficiency of the services delivered to stakeholders. Responsibilities: Full-stack hands-on developer and strong individual contributor. Go-to person on the assigned projects. Able to understand and implement the project as per the proposed architecture. Implements best design principles and patterns. Understands and implements the security aspects of the application. Knows ADO and is familiar with using ADO. Obtains/maintains appropriate certifications and applies respective skills on project engagements. Leads or contributes significantly to the Practice. Estimates and prioritizes Product Backlogs. Defines work items. Works on unit test automation. Recommends improvements to existing software programs as deemed necessary. Go-to person in the team for any technical issues. Conducts peer reviews. Conducts tech sessions within the team. Provides input to standards and guidelines. Implements best practices to enable consistency across all projects. Participates in the continuous improvement processes, as assigned. Mentors and coaches juniors in the team. Contributes to POCs. Supports the QA team with clarifications/doubts. Takes ownership of deployment and tollgate activities. Oversees the development of documentation.
Participates in regular work, status communications, and stakeholder updates. Supports development of intellectual capital. Contributes to the knowledge network. Acts as a technical escalation point. Conducts sprint reviews. Optimizes code and advises the team on best practices. Skills: Education qualification: BE / B Tech (IT/CS/Electronics) / MCA / MSc Computer Science 6-9 years of IT experience in application development, support, or maintenance activities. 2+ years of experience in team management. Must have in-depth knowledge of software development lifecycles, including agile development and testing. Enterprise Data Management framework, data security & compliance (optional). Data Ingestion, Storage, and Transformation. Data Auditing and Validation (optional). Data Visualization with Power BI (optional). Data Analytics systems (optional). Scaling and handling large data sets. Designing & building data services, with at least 2+ years in: Azure SQL DB, SQL Warehouse, ADF, Azure Storage, ADO CI/CD, Azure Synapse. Data Model Design. Data Entities: modeling and depiction. Metadata Mgmt (optional). Database development patterns and practices: SQL / NoSQL (Relational / Non-Relational – native JSON), flexi schema, indexing practices, master/child model data mgmt, Columnar, Row. API / SDK for NoSQL DBs Ops & Mgmt. Design and implementation of Data Warehouse, Azure Synapse, Data Lake, Delta Lake. Apache Spark Mgmt. Programming languages: PySpark / Python, C# (optional). API: Invoke / Request and Response. PowerShell with Azure CLI (optional). Git with ADO Repo Mgmt, branching strategies, version control mgmt, rebasing, filtering, cloning, merging. Debugging & performance tuning and optimization skills: ability to analyze PySpark code and PL/SQL. Enhancing response times. GC Mgmt. Debugging, logging, and alerting techniques. Prior experience that demonstrates good business understanding is needed (experience in a professional services organization is a plus).
Excellent written and verbal communication, organization, analytical, planning, and leadership skills. Strong management, communication, technical, and remote collaboration skills are a must. Experience in dealing with multiple projects and cross-functional teams, and ability to coordinate across teams in a large matrix organization environment. Ability to effectively conduct technical discussions directly with Project/Product management and clients. Excellent team collaboration skills. Education & Experience: Education qualification: BE / B Tech (IT/CS/Electronics) / MCA / MSc Computer Science 6-9 years of domain experience or other relevant industry experience. 2+ years of Product Owner, Business Analyst, or System Analysis experience. Minimum 3+ years of software development experience in .NET projects. 3+ years of experience in Agile/Scrum methodology. Work timings: 9am-4pm, 7pm-9pm Location: Hyderabad Experience: 6-9 yrs The team At Deloitte, the Shared Services center improves overall efficiency and control while giving every business unit access to the company’s best and brightest resources. It also lets business units focus on what really matters – satisfying customers and developing new products and services to sustain competitive advantage. A shared services center is a simple concept, but making it work is anything but easy. It involves consolidating and standardizing a wildly diverse collection of systems, processes, and functions. And it requires a high degree of cooperation among business units that generally are not accustomed to working together – with people who do not necessarily want to change. The USI shared services team provides a wide array of services to the U.S. and is constantly evaluating and expanding its portfolio. The shared services team provides call center support, Document Services support, financial processing and analysis support, Record management support, Ethics and compliance support, and admin assistant support.
How You’ll Grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform.
We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. #CAP-PD Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits.
Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300914

Posted 1 week ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description: Business Title QA Manager Years Of Experience 10+ Job Description The purpose of this role is to ensure the developed software meets the client requirements and the business’ quality standards within the project release cycle and established processes, and to lead QA technical initiatives in order to optimize the test approach and tools. Must Have Skills At least 2 years in a lead role. Experience with the Azure cloud. Testing file-based data lake solutions or Big Data-based solutions. Worked on migration or implementation of Azure Data Factory projects. Strong experience in ETL/data pipeline testing, preferably with Azure Data Factory. Proficiency in SQL for data validation and test automation. Familiarity with Azure services: Data Lake, Synapse Analytics, Azure SQL, Key Vault, and Logic Apps. Experience with test management tools (e.g., Azure DevOps, JIRA, TestRail). Understanding of CI/CD pipelines and integration of QA in DevOps workflows. Experience with data quality frameworks (e.g., Great Expectations, Deequ). Knowledge of Python or PySpark for data testing automation. Exposure to Power BI or other BI tools for test result visualization. Azure Data Factory Exposure to Azure Databricks SQL/stored procedures on SQL Server ADLS Gen2 Exposure to Python/Shell script Good To Have Skills Exposure to any ETL tool. Any other cloud experience (AWS / GCP). Exposure to Spark architecture, including Spark Core, Spark SQL, DataFrame, Spark Streaming, and fault tolerance mechanisms. ISTQB or equivalent QA certification. Working experience on JIRA and Agile. Experience with testing SOAP / API projects. Stakeholder communication. Microsoft Office. Key Responsibilities Lead the QA strategy, planning, and execution for ADF-based data pipelines and workflows. Design and implement test plans, test cases, and test automation for data ingestion, transformation, and loading processes.
Validate data accuracy, completeness, and integrity across source systems, staging, and target data stores (e.g., Azure SQL, Synapse, Data Lake). Collaborate with data engineers, architects, and business analysts to understand data flows and ensure test coverage. Develop and maintain automated data validation scripts using tools like PySpark, SQL, PowerShell, or Azure Data Factory Data Flows. Monitor and report on data quality metrics, defects, and test coverage. Ensure compliance with data governance, security, and privacy standards. Mentor junior QA team members and coordinate testing efforts across sprints. Education Qualification Minimum Bachelor’s degree in Computer Science, Information Systems, or a related field. Certification If Any Any basic-level certification in AWS / Azure / GCP; Snowflake Associate / Core Shift timing 12 PM to 9 PM and / or 2 PM to 11 PM - IST time zone Location: DGS India - Mumbai - Goregaon Prism Tower Brand: Merkle Time Type: Full time Contract Type: Permanent
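The validation duties listed above (reconciling counts and completeness between source and target stores) can be sketched generically. The table names and key below are hypothetical, and an in-memory SQLite database stands in for Azure SQL or Synapse purely for illustration; the same SQL pattern would apply there:

```python
# Sketch of a source-to-target reconciliation check a QA pipeline might run.
import sqlite3

def reconcile(conn, source, target, key):
    """Compare row counts and find keys present in source but missing in target."""
    cur = conn.cursor()
    src_n = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_n = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = cur.execute(
        f"SELECT COUNT(*) FROM {source} s LEFT JOIN {target} t "
        f"ON s.{key} = t.{key} WHERE t.{key} IS NULL"
    ).fetchone()[0]
    return {"source_rows": src_n, "target_rows": tgt_n, "missing_in_target": missing}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src(id INTEGER PRIMARY KEY, amt REAL);
    CREATE TABLE tgt(id INTEGER PRIMARY KEY, amt REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
report = reconcile(conn, "src", "tgt", "id")  # row 3 never reached the target
```

Frameworks named in the listing, such as Great Expectations, package checks like these as reusable, reportable expectations.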

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

A.P. Moller - Maersk A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, and this means we have an elevated level of responsibility to continue to build an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners too. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers’ supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners. Key Responsibilities: Partner with business, product, and engineering teams to define problem statements, evaluate feasibility, and design AI/ML-driven solutions that deliver measurable business value. Lead and execute end-to-end AI/ML projects — from data exploration and model development to validation, deployment, and monitoring in production. Drive solution architecture using techniques in data engineering, programming, machine learning, NLP, and Generative AI. Champion the scalability, reproducibility, and sustainability of AI solutions by establishing best practices in model development, CI/CD, and performance tracking. Guide junior and associate AI/ML engineers through technical mentoring, code reviews, and solution reviews. Identify and evangelize the adoption of emerging tools, technologies, and methodologies across teams.
Translate technical outputs into actionable insights for business stakeholders through storytelling, data visualizations, and stakeholder engagement. We are looking for: A seasoned AI/ML engineer with 7+ years of hands-on experience delivering enterprise-grade AI/ML solutions. Advanced proficiency in Python, SQL, and PySpark, and experience working with cloud platforms (Azure preferred) and tools such as Databricks, Synapse, ADF, and Web Apps. Strong expertise in applied text analytics, NLP, and Generative AI, with real-world deployment exposure. Solid understanding of model evaluation, optimization, bias mitigation, and monitoring in production. A problem solver with scientific rigor, strong business acumen, and the ability to bridge the gap between data and decisions. Prior experience in leading cross-functional AI initiatives or collaborating with engineering teams to deploy ML pipelines. Bachelor's or Master’s degree in Computer Science, Engineering, Statistics, or a related quantitative field. A PhD is a plus. Prior understanding of the shipping and logistics business domain is an advantage. Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key responsibilities: Collaborate with business, platform, and technology stakeholders to understand the scope of projects. Perform comprehensive exploratory data analysis at various levels of granularity of the data to derive inferences for further solutioning/experimentation/evaluation. Design, develop, and deploy robust enterprise AI solutions using Generative AI, NLP, machine learning, etc. Continuously focus on providing business value while ensuring technical sustainability. Promote and drive adoption of cutting-edge data science and AI practices within the team. Continuously stay up to date on relevant technologies and use this knowledge to push the team forward. We are looking for: A team player with 4-7 years of experience in the field of data science and AI. Proficiency with programming/querying languages like Python, SQL, and PySpark, along with Azure cloud platform tools like Databricks, ADF, Synapse, Web App, etc. An individual with strong work experience in the areas of text analytics, NLP, and Generative AI. A person with a scientific and analytical thinking mindset, comfortable with brainstorming and ideation. A doer with a deep interest in driving business outcomes through AI/ML. A candidate with a bachelor’s or master’s degree in engineering or computer science, with/without a specialization within the field of AI/ML. A candidate with strong business acumen and the desire to collaborate with business teams and help them by solving business problems. Prior understanding of the shipping and logistics business domain is an advantage. Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
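The text analytics experience this listing asks for starts with basic term statistics. As a minimal, purely illustrative sketch (the sample sentence and tokenizer are invented, and real NLP work would use a proper library), here is term-frequency counting over a shipping-style text:

```python
# Minimal text-analytics sketch: lowercase, tokenize, and count terms.
from collections import Counter
import re

def term_freq(doc):
    """Lowercase the document, tokenize on alphabetic runs, and count terms."""
    return Counter(re.findall(r"[a-z]+", doc.lower()))

tf = term_freq("Delay at port; port congestion caused the delay.")
```

Counts like these feed directly into TF-IDF features, keyword extraction, and other downstream NLP steps mentioned in the role.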

Posted 1 week ago

Apply

8.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description: Candidates with 8+ years of experience in the IT industry and with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client and the resource should be hands-on, with experience in coding and Azure Cloud. Responsibilities include: · Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery · Integrate and support third-party APIs and external services · Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack · Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC) · Participate in Agile/Scrum ceremonies and manage tasks using Jira · Understand technical priorities, architectural dependencies, risks, and implementation challenges · Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability Primary Skills: 8+ years of hands-on development experience with: · C#, .NET Core 6/8+, Entity Framework / EF Core · JavaScript, jQuery, REST APIs · Expertise in MS SQL Server, including: · Complex SQL queries, Stored Procedures, Views, Functions, Packages, Cursors, Tables, and Object Types · Skilled in unit testing with XUnit, MSTest · Strong in software design patterns, system architecture, and scalable solution design · Ability to lead and inspire teams through clear communication, technical mentorship, and ownership · Strong problem-solving and debugging capabilities · Ability to write reusable, testable, and efficient code · Develop and maintain frameworks and shared libraries to support large-scale applications · Excellent technical documentation, communication, and leadership skills · Microservices and Service-Oriented Architecture (SOA) · Experience in API Integrations 2+ years of hands-on experience with Azure Cloud Services, including: · Azure Functions · Azure
Durable Functions · Azure Service Bus, Event Grid, Storage Queues · Blob Storage, Azure Key Vault, SQL Azure · Application Insights, Azure Monitoring Secondary Skills: · Familiarity with AngularJS, ReactJS, and other front-end frameworks · Experience with Azure API Management (APIM) · Knowledge of Azure Containerization and Orchestration (e.g., AKS/Kubernetes) • Experience with Azure Data Factory (ADF) and Logic Apps · Exposure to Application Support and operational monitoring · Azure DevOps - CI/CD pipelines (Classic / YAML) Qualification: Any UG / PG Degree / Engineering Graduates Experience: Minimum 8+ Years Gender: Male / Female Job Location: Trivandrum / Kochi (KERALA) Job Type: Full Time | Mid Shift | Sat & Sun Week Off Working Time: 12:01 PM to 9:00 PM Project: European client | Shift: Mid Shift (12:01PM TO 9:00PM) | WFO Salary: Rs.18,00,000 to 30,00,000 LPA Apply to hr@trueledge.com or info@trueledge.com

Posted 1 week ago

Apply

7.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

7 - 9 Years 4 Openings Trivandrum Role description Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements, create optimal architecture, and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, debug, and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.
Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction in recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected: Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers. Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results. Configure: Define and govern the configuration management plan. Ensure compliance from the team. Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team. Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications. Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules. Manage Defects: Perform defect root cause analysis (RCA) and mitigation.
Identify defect trends and implement proactive measures to improve quality. Estimate: Create and provide input for effort and size estimation and plan resources for projects. Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release: Execute and monitor the release process. Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models. Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs. Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures. Certifications: Obtain relevant domain and technology certifications. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices. Additional Comments: We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, and experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role. Must-Have Skills 8+ years of hands-on experience in data engineering or big data development. Strong proficiency in PySpark and SQL for data transformation and pipeline development. Experience working in Azure Databricks or equivalent Spark-based cloud platforms. Practical knowledge of cloud data environments – Azure, AWS, or GCP. Solid understanding of data warehousing concepts, including Kimball methodology and star/snowflake schema design. Proven experience designing and maintaining ETL/ELT pipelines in production. Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory). Skills: Azure Data Factory, Azure Databricks, PySpark, SQL About UST UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation.
With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
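The Kimball star-schema modeling this role calls for hinges on one mechanic: assigning surrogate keys to dimension members and pointing fact rows at them. As a toy sketch only (invented data, plain Python standing in for a Spark or SQL implementation), the core of a dimension load looks like:

```python
# Sketch of Kimball-style dimension building: distinct members get surrogate
# keys, and fact rows reference the dimension via those keys.
def build_dim(rows, natural_key):
    """Assign surrogate keys to distinct dimension members."""
    dim, keymap = [], {}
    for r in rows:
        nk = r[natural_key]
        if nk not in keymap:
            keymap[nk] = len(keymap) + 1  # next surrogate key
            dim.append({"sk": keymap[nk], **r})
    return dim, keymap

sales = [
    {"customer": "acme", "region": "EU"},
    {"customer": "globex", "region": "US"},
    {"customer": "acme", "region": "EU"},  # repeat customer, same key
]
dim_customer, keys = build_dim(sales, "customer")
fact_sales = [{"customer_sk": keys[s["customer"]]} for s in sales]
```

In a production warehouse the same lookup is done with joins against the dimension table, plus handling for slowly changing dimensions, which this sketch omits.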

Posted 1 week ago

Apply

2.0 - 5.0 years

3 - 9 Lacs

Gurgaon

On-site

Work Flexibility: Hybrid Senior Analyst, Analytics What will you do: Improve and maintain Azure-based data warehouse solutions. Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks. Manage relationships with IT vendors to ensure optimal service delivery and performance. Offer best practices, advice, and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions. Act as the liaison between technical teams and business stakeholders to ensure effective service delivery. Collaborate with cloud architects and engineers to optimize cost, performance, and security. Assist with onboarding new Azure services and integrating them into existing operations. Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions. Operations Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances and conduct long-term improvement operations to ensure compatibility with evolving mission requirements. What you need: Bachelor’s degree required; Master’s degree in Computer Science or Business Administration preferred. 2 to 5 years of experience on the Azure platform (Synapse, ADF, Databricks, Power BI). Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305). Strong understanding of Azure services including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager. Experience in IT Service Management (ITSM), data analysis, and business process automation. Ability to develop good working relationships with technical and business teams, using strong communication and team-building skills. Ability to analyze numbers, trends, and data to make new conclusions based on findings.
Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control. Travel Percentage: 10%

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies