6.0 - 9.0 years
0 - 0 Lacs
Hyderabad
On-site
We're seeking an experienced ETL Lead to lead our data integration and data warehousing efforts. As an ETL Lead, you'll design, develop, and maintain ETL processes and data pipelines using tools like Informatica, Talend, or AWS Glue. You'll also develop and maintain data models and data warehouses to support business intelligence and analytics needs. Strong leadership and communication skills are required to lead a team of ETL developers and collaborate with cross-functional teams. The ideal candidate will have 5+ years of experience in ETL development and data warehousing, with a strong technical background and excellent analytical and problem-solving skills. Experience with cloud-based ETL tools and Agile methodologies is a plus. We're offering a competitive salary and benefits package, opportunities for growth and development, and a collaborative work environment. If you're a motivated and experienced ETL professional looking for a leadership role, we'd love to hear from you!
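To make the day-to-day concrete, here is a minimal PySpark sketch of the extract-transform-load pattern this role owns: read from a landing zone, cleanse and conform, and load into a curated warehouse layer. The paths, table, and column names are hypothetical, and the real stack could equally be Informatica, Talend, or AWS Glue.

```python
# Minimal ETL sketch in PySpark: a hedged illustration, not a prescribed design.
# Source/target paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw orders from the landing zone
raw = spark.read.parquet("s3://landing/orders/")  # or an ADLS/HDFS path

# Transform: cleanse and conform to the warehouse model
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Load: append into the curated warehouse layer, partitioned by date
orders.write.mode("append").partitionBy("order_date").parquet("s3://warehouse/fact_orders/")
```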
Posted 19 hours ago
4.0 - 7.0 years
5 - 8 Lacs
Hyderābād
On-site
About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization: transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.

Position Overview: We are seeking a self-driven Data Engineer with 4-7 years of experience to build and optimize scalable ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. The role involves working across scrum teams to develop data solutions, ensure data governance with Unity Catalog, and support real-time and batch processing. Strong problem-solving skills, T-SQL expertise, and hands-on experience with Azure cloud tools are essential. Healthcare domain knowledge is a plus.

Job Description:
- Work with different scrum teams to deliver the database programming requirements of each sprint.
- Hands-on experience with Azure cloud services, including advanced Python programming, Databricks, Azure SQL, Data Factory (ADF), Data Lake, Azure Storage, and SSIS.
- Create and deploy scalable ETL/ELT pipelines with Azure Databricks using PySpark and SQL.
- Create Delta Lake tables with ACID transactions and schema evolution to support real-time and batch processing (see the sketch below).
- Use Unity Catalog for centralized data governance, access control, and data lineage tracking.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Develop unit tests so that code can be tested automatically.
- Apply SOLID development principles to maintain data integrity and cohesiveness.
- Interact with the product owner and business representatives to determine and satisfy their needs.
- Sense of ownership and pride in your performance and its impact on the company's success.
- Critical thinking and problem-solving skills; team player with good time-management, interpersonal, and communication skills.

Mandatory Qualifications:
- 4-7 years of experience as a Data Engineer, self-driven and able to work with minimal supervision.
- Proven experience with T-SQL programming, Azure Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog, and ADLS Gen2.
- Exposure to Microsoft TFS, Visual Studio, and DevOps.
- Experience with cloud platforms, preferably Azure.
- Analytical, problem-solving mindset.

Preferred Qualifications:
- Healthcare domain knowledge.
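As a concrete, if simplified, illustration of the Delta Lake work described above (creating an ACID table and appending to it with schema evolution enabled), consider this hedged PySpark sketch. It assumes a Databricks or Spark session with Delta Lake available; the table and column names are hypothetical.

```python
# Hedged sketch: create a Delta table and append with schema evolution.
# Assumes a Databricks/Spark session with Delta Lake; names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_demo").getOrCreate()

# Initial batch creates the Delta table (writes are ACID transactions)
df = spark.createDataFrame([(1, "claim_a"), (2, "claim_b")], ["claim_id", "claim_type"])
df.write.format("delta").mode("overwrite").saveAsTable("bronze.claims")

# A later batch arrives with an extra column; mergeSchema evolves the table schema
df2 = spark.createDataFrame([(3, "claim_c", "2024-01-01")],
                            ["claim_id", "claim_type", "received_date"])
(df2.write.format("delta")
     .mode("append")
     .option("mergeSchema", "true")  # schema evolution on append
     .saveAsTable("bronze.claims"))
```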
Posted 21 hours ago
5.0 years
1 - 3 Lacs
Hyderābād
On-site
Job Description

Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency (a freshness-check sketch follows below).
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
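One small example of the observability work this role supports: a data-freshness monitor over a pipeline's output table. This is a hedged Python sketch; the table name, timestamp column, and SLA threshold are hypothetical, and it assumes ingestion timestamps are stored in UTC.

```python
# Hedged sketch of a data-freshness check for pipeline monitoring.
# Table name, timestamp column, and SLA are hypothetical assumptions.
from datetime import datetime, timedelta
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("freshness_check").getOrCreate()

SLA = timedelta(hours=2)  # pipeline expected to land data at least every 2 hours

latest = (spark.table("curated.sales_orders")
               .agg(F.max("ingested_at").alias("latest"))
               .collect()[0]["latest"])

lag = datetime.utcnow() - latest  # assumes ingested_at is stored in UTC
if lag > SLA:
    # In practice this would raise an alert (email, Teams, PagerDuty, ...)
    print(f"ALERT: curated.sales_orders is stale by {lag - SLA}")
else:
    print(f"OK: last ingest {lag} ago, within SLA")
```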
Posted 21 hours ago
2.0 years
0 Lacs
Hyderābād
On-site
Job Description

Overview
The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools/Spark/Databricks and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users. This will give you the right visibility into, and understanding of, the criticality of your developments.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Active contributor to code and development in projects and services
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption
- Partner with ML engineers working on industrialization
- Communicate with business stakeholders during service design, training, and knowledge transfer
- Support large-scale experimentation and build data-driven models
- Refine requirements into modelling problems
- Influence product teams through data-based recommendations
- Research state-of-the-art methodologies
- Create documentation for learnings and knowledge transfer
- Create reusable packages or libraries
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time)
- Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines (a tracking sketch follows below)
- Automate ML model deployments

Qualifications
- BE/B.Tech in Computer Science, Maths, or other technical fields
- Overall 2-4 years of experience working as a Data Scientist
- 2+ years' experience building solutions in the commercial or supply chain space
- 2+ years working in a team to deliver production-level analytic solutions
- Fluent in git (version control); understanding of Jenkins and Docker is a plus
- Fluent in SQL syntax
- 2+ years' experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems
- 2+ years' experience developing business-problem-related statistical/ML models with industry tools, with a primary focus on Python or PySpark development
- Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus
- Programming skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities
- Experience with Agile methodology for teamwork and analytics 'product' creation
- Experience in Reinforcement Learning is a plus
- Experience in simulation and optimization problems in any space is a plus
- Experience with Bayesian methods, causal inference, NLP, Responsible AI, or distributed machine learning is a plus
- Experience in DevOps, with hands-on experience with one or more cloud service providers (AWS, GCP, Azure preferred)
- Model deployment experience is a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is preferred; experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
- Stakeholder engagement: BU, vendors
- Experience building statistical models in the retail or supply chain space is a plus
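To ground the ML-lifecycle responsibility above, here is a hedged sketch of the train-track-log loop using MLflow, which ships with Azure Databricks. The dataset and run name are synthetic stand-ins.

```python
# Hedged sketch of experiment tracking with MLflow; data and names are synthetic.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf_baseline"):
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("accuracy", acc)
    # Logging the model makes it available for registration and CI/CD promotion
    mlflow.sklearn.log_model(model, artifact_path="model")
```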
Posted 21 hours ago
5.0 - 10.0 years
0 Lacs
Hyderābād
On-site
Job Description

Overview
DataOps L3: The role will leverage and enhance existing data and analytics technologies such as Power BI, Azure data engineering services, ADLS, ADB (Azure Databricks), Synapse, and other Azure services. The role will be responsible for developing and supporting IT products and solutions using these technologies and deploying them for business users.

Responsibilities
- 5 to 10 years of IT and Azure data engineering experience
- Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services
- Working experience in Python, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON and Parquet
- Experience in creating ADF pipelines to source and process data sets
- Experience in creating Databricks notebooks to cleanse, transform, and enrich data sets (a sketch follows below)
- Development experience in orchestration of pipelines
- Good understanding of SQL, databases, and data warehouse systems, preferably Teradata
- Experience in deployment and monitoring techniques
- Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources
- Experience in handling operations/integration with a source repository
- Good knowledge of data warehouse concepts and data warehouse modelling
- Working knowledge of ServiceNow (SNOW), including resolving incidents, handling change requests/service requests, and reporting on metrics to provide insights
- Collaborate with the project team to understand tasks, model tables using data warehouse best practices, and develop data pipelines to ensure the efficient delivery of data
- Strong expertise in performance tuning and optimization of data processing systems
- Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services
- Develop and enforce best practices for data management, including data governance and security
- Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs
- Proficient in implementing a DataOps framework

Qualifications
- Azure Data Factory, Azure Databricks, Azure Synapse, PySpark/SQL, ADLS
- Azure DevOps with CI/CD implementation

Nice-to-Have Skill Sets
- Business Intelligence tools (preferably Power BI)
- DP-203 certified
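By way of illustration, a hedged sketch of a typical Databricks cleanse/transform notebook cell: JSON in from the landing container, Parquet out to the curated container. The storage paths and columns are hypothetical.

```python
# Hedged sketch of a cleanse/transform notebook step; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse_enrich").getOrCreate()

raw = spark.read.json("abfss://landing@datalake.dfs.core.windows.net/events/")

cleansed = (
    raw.filter(F.col("event_id").isNotNull())        # drop malformed rows
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("country", F.upper(F.trim("country")))
       .dropDuplicates(["event_id"])
)

cleansed.write.mode("overwrite").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/events/"
)
```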
Posted 21 hours ago
3.0 years
3 - 8 Lacs
Hyderābād
On-site
Job Description

Overview
We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers, and history makers, located around the world and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview:
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS will also lead, facilitate, and collaborate with the larger DS community in PepsiCo. DS provides the talent for the development and support of DS components and their life cycle within DA&AI products, and supports "pre-engagement" activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad
The role will work on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools/Spark/Databricks and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Use big data technologies to help process data and build scaled data pipelines (batch to real time)
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP Pipelines
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
- Automate ML model deployments (a registry-promotion sketch follows below)

Qualifications
- Minimum 3 years of hands-on work experience in data science/machine learning
- Minimum 3 years of SQL experience
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers
- BE/BS in Computer Science, Math, Physics, or other technical fields
- Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Any cloud: experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage
- Model deployment experience will be a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is required
- Experience using MLflow, Kubeflow, etc. will be preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
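One way the deployment-automation step above can look in practice is a gated promotion in the MLflow model registry. This is a hedged sketch under stated assumptions: the model name, metric name, and quality bar are hypothetical, and real pipelines would wire this into CI/CD rather than run it by hand.

```python
# Hedged sketch: promote a registered model version only if it meets a quality bar.
# Model name, metric key, and threshold are hypothetical.
from mlflow.tracking import MlflowClient

client = MlflowClient()
model_name = "demand_forecast"

# Find the newest registered version and its recorded validation metric
latest = client.get_latest_versions(model_name, stages=["None"])[0]
run = client.get_run(latest.run_id)
rmse = run.data.metrics.get("val_rmse", float("inf"))

# Promote only if the model beats the agreed quality bar
if rmse < 25.0:
    client.transition_model_version_stage(model_name, latest.version, stage="Staging")
    print(f"Version {latest.version} promoted to Staging (val_rmse={rmse:.2f})")
else:
    print(f"Version {latest.version} held back (val_rmse={rmse:.2f})")
```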
Posted 21 hours ago
0 years
3 - 7 Lacs
Gurgaon
On-site
About the company
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for seamless payment experiences and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, colour, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.

What's in it for YOU
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, Inclusive and Diverse team culture
- Gender Neutral Policy
- Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits
- Commitment to the overall development of an employee through a comprehensive learning & development framework

Role Purpose
The role is responsible for designing, developing, and enhancing intranet and enterprise portal solutions to improve collaboration, content management, and workflow automation. It requires hands-on expertise in Oracle WebCenter, ADF, Oracle BPM, and Microsoft PowerApps, while leading the integration and maintenance of internal platforms across the organization.

Role Accountability
- Portal Development: Design and develop enterprise-grade intranet portals using Oracle WebCenter, ADF, and Microsoft PowerApps.
- Business Process Automation: Work with Oracle BPM to streamline and optimize business workflows.
- UI/UX Development: Develop intuitive and responsive interfaces using ADF components and PowerApps UI tools.
- Integration & Customization: Integrate portal solutions with enterprise applications like ERP, CRM, and legacy systems.

Measures of Success
- Code Proficiency – Clean, maintainable development
- Timely Delivery – On-time task completion
- Skill Growth – Continuous learning mindset
- Team Collaboration – Cross-functional teamwork
- Code Quality – Testing and documentation
- Agile Discipline – DevOps and CI/CD
- Issue Resolution

Technical Skills / Experience / Certifications
- WebCenter Expertise – Portal and content management
- Power Platform – PowerApps and automation
- ADF & BPM – Oracle development tools
- Full Stack – Java, SQL, JavaScript
- API Integration – REST/SOAP experience
- Security Protocols – OAuth, SAML, SSO
- Agile Delivery – CI/CD and sprints
- Troubleshooting Strength – Strong problem-solving

Competencies critical to the role
- BPM Expertise – Business process automation
- WebCenter Proficiency – Portal and content tools
- Full Stack – Java, SQL, JavaScript
- ADF Skills – Oracle application development
- AI/ML Awareness – Intelligent process enhancement
- GenAI Familiarity – Generative AI applications
- Chatbot Integration – Conversational automation tools
- Automation Tools – End-to-end process improvement

Qualification
B.E., B.Tech, MCA

Preferred Industry
Software and Services
Posted 21 hours ago
5.0 - 12.0 years
0 Lacs
India
On-site
Job Description

Title: Azure Integration & Development
Experience: 5-12 years
Location: Mumbai/Pune/Bangalore/Hyderabad/Chennai/Kolkata/Noida
Notice period: Immediate to 15 days max

The candidate will have extensive experience in designing, developing, and deploying solutions using various Azure services, including Logic Apps, Function Apps, Azure Data Factory (ADF), Azure AML, Azure Synapse, workflows, and CI/CD pipelines. This role requires a deep understanding of cloud computing concepts and the ability to collaborate with cross-functional teams to deliver high-quality solutions.

Key Responsibilities
1. Design and Develop Solutions
- Create and implement cloud-based applications using Azure Logic Apps and Function Apps (a Function sketch follows below)
- Develop and manage data pipelines using Azure Data Factory (ADF) to ensure efficient data integration and transformation
- Design and automate workflows to streamline business processes and improve operational efficiency
2. CI/CD Pipeline Development
- Design, implement, and maintain CI/CD pipelines for automated build, test, and deployment processes
- Collaborate with development, QA, and operations teams to streamline software delivery and ensure seamless integration and deployment
3. Monitoring and Troubleshooting
- Monitor and maintain Azure Logic Apps, Function Apps, and data pipelines to ensure optimal performance and reliability
- Troubleshoot and resolve issues related to deployed applications and workflows
4. Collaboration and Documentation
- Work closely with stakeholders to gather requirements and provide technical guidance
- Document solutions and provide training to users and support teams
- Stay updated with the latest Azure features and best practices

Qualifications
- Bachelor's degree in computer science, information technology, or a related field
- Proven experience in designing and implementing solutions using Azure Logic Apps, Function Apps, and Azure Data Factory
- Strong understanding of cloud computing concepts, particularly within the Azure ecosystem
- Experience with CI/CD pipeline development and deployment using Azure DevOps
- Proficiency in programming languages such as C#, Python, or JavaScript
- Excellent analytical and problem-solving skills
- Effective communication and collaboration skills
- Ability to manage multiple projects and prioritize tasks effectively

Skills: Azure Logic Apps, Azure Function Apps, Azure Data Factory (ADF), Azure AML, Azure Synapse, workflow automation, CI/CD pipelines, Azure DevOps, troubleshooting, and technical documentation
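As a small concrete example of the Function App work named above, here is a hedged sketch of an HTTP-triggered Azure Function using the Python v2 programming model. The route, payload shape, and downstream behavior are hypothetical.

```python
# Hedged sketch of an HTTP-triggered Azure Function (Python v2 programming model).
# Route name and payload fields are hypothetical.
import json
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="orders", methods=["POST"])
def ingest_order(req: func.HttpRequest) -> func.HttpResponse:
    try:
        order = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON", status_code=400)

    # In a real workflow this would enqueue to Service Bus or trigger an ADF pipeline
    return func.HttpResponse(
        json.dumps({"received": order.get("order_id")}),
        mimetype="application/json",
        status_code=202,
    )
```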
Posted 23 hours ago
0 years
0 Lacs
India
On-site
Dotnet Architect

Job Summary:
We are seeking a highly experienced and solution-oriented .NET Architect to lead the design and development of scalable, high-performance enterprise applications. The ideal candidate will have a strong background in .NET technologies, frontend frameworks like Angular, and a deep understanding of Azure cloud services.

Core Responsibilities:
- Lead end-to-end architecture, design, and delivery of enterprise-grade .NET applications.
- Define technical standards, best practices, and architectural guidelines across projects.
- Drive design and implementation of cloud-native solutions using Azure services (App Services, Functions, Storage, Key Vault, etc.).
- Oversee integration between front-end (Angular) and backend (.NET Core/.NET 6+) components.
- Ensure security, scalability, and performance in cloud-hosted applications.
- Collaborate with cross-functional teams including developers, DevOps, QA, and stakeholders.
- Guide and mentor development teams on architecture and design patterns.
- Conduct code and design reviews to ensure high-quality delivery.

Required Skills:
- Expertise in .NET Core / .NET 6+, C#, Web API, and microservices architecture.
- Strong hands-on experience with Angular (8+).
- Proven experience designing and deploying solutions on Microsoft Azure.
- Knowledge of Azure services like App Services, Azure Functions, ADF, Key Vault, Service Bus, etc.
- Good understanding of DevOps practices and CI/CD pipelines using Azure DevOps.
- Solid understanding of design patterns, RESTful APIs, and security best practices.
- Excellent problem-solving and leadership skills.
Posted 23 hours ago
8.0 years
0 Lacs
India
Remote
Job Title: Senior DotNet Developer
Experience: 8+ Years
Location: Remote
Contract Duration: Long Term
Work Time: 12.00 PM to 9.00 PM

Job Purpose
Looking for candidates with 8+ years of experience in the IT industry, possessing strong skills in .Net/.Net Core, Azure Cloud Services, and Azure DevOps. This role involves direct client interaction, so strong communication skills are essential. The candidate must be hands-on in coding and Azure Cloud technologies. Work hours are 8 hours daily, with a mandatory 4-hour overlap with the EST time zone (12 PM - 9 PM) to accommodate meetings.

Responsibilities
- Design, develop, enhance, document, and maintain applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate with cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a focus on performance and reliability

Primary Skills
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, and Entity Framework / EF Core
- Experience in JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Unit testing experience using XUnit and MSTest
- Strong knowledge of software design patterns, system architecture, and scalable solution design
- Ability to lead and mentor teams through effective communication and technical ownership
- Strong problem-solving and debugging skills
- Ability to write reusable, testable, and efficient code
- Experience in developing and maintaining frameworks and shared libraries
- Strong technical documentation and leadership skills
- Experience with microservices and Service-Oriented Architecture (SOA)
- Hands-on experience in API integrations
- Minimum 2 years of experience with Azure Cloud Services, including: Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills (Good to Have)
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Experience with Azure DevOps and CI/CD pipelines (Classic / YAML)

Certifications Required (If Any)
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or Cloud technologies
Posted 1 day ago
5.0 - 12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.

Job Title: Business Analyst
Location: PAN India
Work Mode: Hybrid
Experience: 5-12 years (4 years relevant)
Job Type: Contract to hire (C2H)
Notice Period: Immediate or up to 15 days joiners

Mandatory Skills: Business analysis, Cash Flow, P&L, TWC components (Creditors, Debtors, Inventory), and FCF (Free Cash Flow); a worked example of the TWC and FCF arithmetic follows below.

We are seeking a highly skilled and experienced Senior Business Analyst with deep domain expertise in finance, specifically in Total Working Capital management within large FMCG corporations. This role is pivotal in driving the development of a comprehensive financial dashboard that visualizes key metrics and supports strategic decision-making across business units.

Key Responsibilities
Requirements Gathering & Analysis
- Collaborate with stakeholders to understand complex financial reporting needs
- Translate business requirements into functional specifications for dashboard development
- Maintain clear documentation of all requirements, changes, and scope (emails, meetings, etc.)

Data Management & Integration
- Design and oversee data pipelines pulling from multiple sources (SAP, SharePoint, legacy SFTP, etc.)
- Ensure accurate transformation and harmonization of data across systems (Azure Functions, ADF, ADB, etc.)
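For orientation, the working-capital and free-cash-flow relationships at the heart of this role reduce to simple arithmetic. A hedged Python illustration with made-up figures (the definitions are the standard ones; actual reporting conventions may vary by company):

```python
# Hedged worked example of the finance metrics named above, with made-up figures
# (all amounts in the same currency unit, e.g. INR crores).
debtors = 120.0    # trade receivables
inventory = 80.0
creditors = 95.0   # trade payables

# Trade working capital: cash tied up in day-to-day operations
twc = debtors + inventory - creditors  # 105.0

operating_cash_flow = 300.0
capex = 110.0

# Free cash flow: cash generated after reinvestment in the asset base
fcf = operating_cash_flow - capex      # 190.0

print(f"TWC = {twc}, FCF = {fcf}")
```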
Posted 1 day ago
9.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 9+ years of architect experience with Azure or AWS and Databricks
Notice: Immediate preferred
Job Location: Noida, Mumbai, Pune, Bangalore, Gurgaon, Kochi (hybrid work)

Job Description:
• Develop a detailed project plan outlining tasks, timelines, milestones, and dependencies.
• Solutions architecture design and implementation.
• Understand the source systems and outline the ADF structure; design and schedule packages using ADF (a pipeline-trigger sketch follows below).
• Foster collaboration and communication within the team to ensure smooth workflow.
• Application performance optimization.
• Monitor and manage the allocation of resources to ensure tasks are adequately staffed.
• Create detailed technical specification, business requirements, and unit test report documents.
• Ensure that the project adheres to best practices, coding standards, and technical requirements.
• Collaborate with technical leads to address technical issues and mitigate risks.
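One small illustration of driving ADF programmatically, as this role's scheduling work might require: kicking off a pipeline run with the azure-mgmt-datafactory SDK. This is a hedged sketch; the subscription ID, resource names, and parameters are hypothetical placeholders.

```python
# Hedged sketch: start an ADF pipeline run from Python via the management SDK.
# Subscription, resource group, factory, pipeline, and parameters are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-enterprise",
    pipeline_name="pl_ingest_orders",
    parameters={"run_date": "2024-01-01"},
)
print("Started pipeline run:", run.run_id)
```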
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You will collaborate with business, platform, and technology stakeholders to understand the scope of projects. Your role will involve performing comprehensive exploratory data analysis at various levels of granularity to derive inferences for further solutioning, experimentation, and evaluation. You will design, develop, and deploy robust enterprise AI solutions using Generative AI, NLP, machine learning, and other relevant technologies (a baseline text-classification sketch follows below). It will be essential to continuously focus on providing business value while ensuring technical sustainability. Additionally, you will promote and drive adoption of cutting-edge data science and AI practices within the team while staying up to date on relevant technologies to propel the team forward.

We are seeking a team player with 4-7 years of experience in the field of data science and AI. The ideal candidate will have proficiency in programming/querying languages like Python, SQL, and PySpark, and familiarity with Azure cloud platform tools such as Databricks, ADF, Synapse, and Web App, among others. You should have strong work experience in text analytics, NLP, and Generative AI, along with a scientific and analytical mindset comfortable with brainstorming and ideation. A deep interest in driving business outcomes through AI/ML is crucial, alongside a bachelor's or master's degree in engineering or computer science, with or without a specialization in AI/ML. Strong business acumen and the desire to collaborate with business teams to solve problems are highly valued. Prior understanding of the shipping and logistics business domain is considered advantageous.

Should you require any adjustments during the application and hiring process, we are happy to support you. For special assistance or accommodations to use our website, apply for a position, or perform a job, please contact us at accommodationrequests@maersk.com.
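To make the text-analytics side tangible, here is a hedged sketch of a baseline text-classification experiment of the kind such exploratory work often starts with. The documents and labels are synthetic stand-ins, not real shipping data.

```python
# Hedged sketch: TF-IDF + logistic regression baseline for text classification.
# Documents and labels are synthetic examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "container delayed at port due to congestion",
    "invoice dispute raised by customer",
    "vessel arrived ahead of schedule",
    "payment overdue on booking",
]
labels = ["operations", "finance", "operations", "finance"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(docs, labels)

# On such a tiny corpus this is only illustrative; expect ['finance'] here
print(clf.predict(["customer raised a billing dispute"]))
```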
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining our dynamic team as an Azure Data Engineer - L3 with 5-7 years of experience, based in either Hyderabad or Bangalore, working shift timings of 2 PM-11 PM IST.

Your responsibilities will include:
- Utilizing your expertise in Azure Data Factory, Databricks, Azure Data Lake, and Azure SQL Server.
- Developing ETL/ELT processes using SSIS and/or Azure Data Factory.
- Building complex pipelines and dataflows with Azure Data Factory.
- Designing and implementing data pipelines using Azure Data Factory (ADF).
- Enhancing the functionality and performance of existing data pipelines.
- Fine-tuning processes that deal with very large data sets.
- Configuring and deploying ADF packages.
- Proficiency in the usage of ARM templates, Key Vault, and integration runtimes (a Key Vault sketch follows below).
- Adapting to ETL frameworks and standards.
- Demonstrating strong analytical and troubleshooting skills to identify root causes and find solutions.
- Proposing innovative and feasible solutions for business requirements.
- Knowledge of Azure technologies/services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs.
- Expertise in ServiceNow, incidents, and JIRA.
- Exposure to agile methodology.
- Proficiency in understanding and building Power BI reports using the latest methodologies.

Your key skills should include:
- Azure
- Azure Data Factory
- Databricks
- Migration project experience

Qualifications:
- Engineering graduate

Certifications:
- Preferable: Azure certification, Databricks

Join us and be a part of our exciting journey as we continue to provide end-to-end solutions in various industry verticals, with a global presence and a track record of successful project deliveries for Fortune 500 companies.
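As a concrete example of the Key Vault usage mentioned above, here is a hedged Python sketch of pulling a connection secret at runtime. The vault URL and secret name are hypothetical.

```python
# Hedged sketch: retrieve a connection secret from Azure Key Vault.
# Vault URL and secret name are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://kv-dataplatform.vault.azure.net/",
    credential=DefaultAzureCredential(),
)

sql_conn = client.get_secret("sql-connection-string").value
# Pass sql_conn on to an ADF linked-service test, a Databricks job, or a JDBC reader
```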
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
- Data Pipeline: Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity (an ingestion sketch follows below)
- Data Integration: Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data
- Data Quality Management: Cleanse data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms
- Data Transformation: Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes
- Data Enablement: Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations

Qualifications & Specifications
- Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent
- Strong programming skills in Python/PySpark/SAS
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization
- Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps
- Hands-on experience with Databricks, Delta Lake, and Workflows
- Knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, and Octopus
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs
- Experience with a BI tool like Power BI (good to have)
- Cloud migration experience (good to have)
- Cloud and data engineering certification (good to have)
- Experience working in an Agile environment
- 4-7 years of relevant work experience needed
- Experience with stakeholder management will be an added advantage

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills And Abilities
- Fluency in English
- Analytical Skills
- Accuracy & Attention to Detail
- Numerical Skills
- Planning & Organizing Skills
- Presentation Skills
- Data Modeling and Database Design
- ETL (Extract, Transform, Load) Skills
- Programming Skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
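One shape the "new API integrations" work above can take is a small ingestion script that lands source-API data as Parquet for downstream Spark processing. This is a hedged sketch: the endpoint, token, and landing path are hypothetical placeholders.

```python
# Hedged sketch: pull records from a source API and land them as Parquet.
# Endpoint, token, and output path are hypothetical.
import requests
import pandas as pd

resp = requests.get(
    "https://api.example.com/v1/shipments",      # hypothetical source API
    headers={"Authorization": "Bearer <token>"},
    params={"updated_since": "2024-01-01"},
    timeout=30,
)
resp.raise_for_status()

df = pd.json_normalize(resp.json()["items"])     # flatten nested JSON records
df["ingested_at"] = pd.Timestamp.utcnow()

# Land as Parquet for downstream Spark/Databricks processing
df.to_parquet("shipments_2024-01-01.parquet", index=False)
```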
Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
- Data Pipeline: Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
- Data Integration: Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data
- Data Quality Management: Cleanse data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms
- Data Transformation: Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes
- Data Enablement: Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations

Qualifications & Specifications
- Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent
- Strong programming skills in Python/PySpark/SAS
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization
- Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps
- Hands-on experience with Databricks, Delta Lake, and Workflows
- Knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, and Octopus
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs
- Experience with a BI tool like Power BI (good to have)
- Cloud migration experience (good to have)
- Cloud and data engineering certification (good to have)
- Experience working in an Agile environment
- 4-7 years of relevant work experience needed
- Experience with stakeholder management will be an added advantage

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills And Abilities
- Fluency in English
- Analytical Skills
- Accuracy & Attention to Detail
- Numerical Skills
- Planning & Organizing Skills
- Presentation Skills
- Data Modeling and Database Design
- ETL (Extract, Transform, Load) Skills
- Programming Skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
Posted 1 day ago
5.0 years
0 Lacs
India
On-site
What You'll Do
Avalara is an AI-first company. We expect every employee to actively leverage AI to enhance productivity, quality, innovation, and customer value. AI is embedded in our workflows and products, and success at Avalara requires embracing AI as an essential capability, not an optional tool.

We are looking for an experienced Oracle Cloud ERP Techno-Functional Consultant to join our team. You have experience with Oracle Cloud ERP and Oracle EBS, specifically the Order to Cash, Procure to Pay, and Tax modules. You have an understanding of core Oracle technology, Oracle business processes, and multiple integration tools, and the ability to collaborate with partners. You will report to the Senior Technical Lead.

What Your Responsibilities Will Be
- Technical Expertise: Programming skills in relevant technologies like Java, SQL, PL/SQL, XML, RESTful APIs, JavaScript, ADF, and web services (a REST sketch follows below). Develop custom solutions, extensions, and integrations to meet our specific requirements.
- Reporting and Analytics: Proficiency in creating custom reports, dashboards, and analytics using Oracle BI Publisher, Oracle OTBI (Oracle Transactional Business Intelligence), and other reporting tools. Experience reviewing code to find and address potential issues and defects. Hands-on experience with BI Publisher, OTBI, and data models.
- Data Integration and Migration: Experience in data integration between Oracle Fusion applications and other systems, and data migration from legacy systems to Oracle Fusion. Knowledge of ETL (Extract, Transform, Load) tools.
- Customization and Extensions: Customize and extend Oracle Fusion applications using tools like Oracle JDeveloper, Oracle ADF, and Oracle Application Composer to tailor the software to meet needs.
- Oracle Fusion Product Knowledge: Expertise in Oracle Fusion Financials, Oracle Fusion SCM (Supply Chain Management), Oracle Fusion Procurement, and Oracle Fusion Tax.
- Security and Access Control: Knowledge of security models, user roles, and access controls within Oracle Fusion applications to ensure data integrity and compliance.
- Performance Tuning and Optimization: Skills in diagnosing and resolving performance issues, optimizing database queries, and ensuring smooth operation of Oracle Fusion applications.
- Problem Troubleshooting: Experience approaching a problem from different angles and analyzing the pros and cons of different solutions to identify and address technical issues, system errors, and integration challenges.
- Communication: Experience communicating updates and resolutions to customers and other partners; able to work with clients, gather requirements, explain technical solutions to non-technical partners, and collaborate with teams.

What You'll Need To Be Successful
Qualifications
- Minimum 5+ years of experience with Oracle Cloud ERP Financials
- Minimum 5+ years of experience with Oracle EBS Financials
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Previous experience implementing Tax modules in Oracle Cloud ERP and Oracle EBS
- Experience with, and desire to work in, a global delivery environment
- Experience with the latest integration methodologies
- Proficiency in CI/CD tools (Jenkins, GitLab, etc.)

How We'll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
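As a flavor of the RESTful integration work above, here is a hedged Python sketch of querying an Oracle Fusion Cloud REST resource. The host, resource path, credentials, and query are hypothetical placeholders; real integrations would consult the pod's REST API catalog and use a proper auth scheme.

```python
# Hedged sketch of a REST call against an Oracle Fusion Cloud instance.
# Host, resource path, credentials, and query are hypothetical placeholders.
import requests

BASE = "https://<pod>.fa.ocs.oraclecloud.com"
resource = "/fscmRestApi/resources/latest/invoices"  # hypothetical resource path

resp = requests.get(
    BASE + resource,
    auth=("integration_user", "<password>"),
    params={"limit": 25, "q": "InvoiceType='Standard'"},
    timeout=60,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item.get("InvoiceNumber"), item.get("InvoiceAmount"))
```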
Inclusive culture and diversity Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. What You Need To Know About Avalara We’re defining the relationship between tax and tech. We’ve already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion dollar business - and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, that empowers our people to win. We’ve been different from day one. Join us, and your career will be too. We’re An Equal Opportunity Employer Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national orientation, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
The BI Data Engineer is a key role within the Enterprise Data team. We are looking for an expert Azure Data Engineer with deep data engineering, ADF integration, and database development experience. This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cloud-based BI tools, such as cloud databases, self-service analytics, and leading visualisation tools, enabling the company’s aim to become a fully digital organisation. Job Description: Key Responsibilities: Build enterprise data engineering and integration solutions using the latest Azure platform: Azure Data Factory, Azure SQL Database, Azure Synapse, and Microsoft Fabric Development of enterprise ETL and integration routines using ADF Evaluate emerging data engineering technologies, standards, and capabilities Partner with business stakeholders, product managers, and data scientists to understand business objectives and translate them into technical solutions Work with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure smooth deployment of data engineering solutions Required Skills And Experience Technical Expertise: Expertise in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, and Microsoft Fabric Exposure to Databricks and lakehouse architecture & technologies Extensive knowledge of data modelling, ETL processes, and data warehouse design principles Experience with machine learning and AI services in Azure. Professional Experience: 5+ years of experience in database development using SQL 5+ years of integration and data engineering experience 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse 2+ years of experience using Power BI Comprehensive understanding of data modelling Relevant certifications in data engineering, machine learning, or AI. Key Competencies: Expertise in data engineering and database development. Familiarity with Microsoft Fabric technologies, including OneLake, Lakehouse, and Data Factory Strong understanding of data governance, compliance, and security frameworks. Proven ability to drive innovation in data strategy and cloud solutions. A deep understanding of business intelligence workflows and the ability to align technical solutions to them Strong database design skills, including an understanding of both normalised-form and dimensional-form databases. In-depth knowledge and experience of data-warehousing strategies and techniques, e.g., Kimball data warehousing (see the sketch after this posting) Experience in cloud-based data integration tools like Azure Data Factory Experience in Azure DevOps or JIRA is a plus Experience working with finance data is highly desirable Familiarity with agile development techniques and objectives Location: Pune Brand: Dentsu Time Type: Full time Contract Type: Permanent
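For the Kimball-style dimensional modelling this role highlights, a minimal PySpark sketch is shown below. The table names, columns, and landing path (orders_raw, dim_customer, fact_orders) are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of a Kimball-style dimensional load in PySpark.
# All table/column names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dim-load-sketch").getOrCreate()

orders = spark.read.parquet("/landing/orders_raw")  # hypothetical landing zone

# Build a customer dimension, deduplicated and keyed by a surrogate key.
dim_customer = (
    orders.select("customer_id", "customer_name", "country")
          .dropDuplicates(["customer_id"])
          .withColumn("customer_sk",
                      F.row_number().over(Window.orderBy("customer_id")))
)

# The fact table carries measures plus the surrogate key, not raw attributes.
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
          .select("customer_sk", "order_id", "order_date", "amount")
)

dim_customer.write.mode("overwrite").saveAsTable("dw.dim_customer")
fact_orders.write.mode("overwrite").saveAsTable("dw.fact_orders")
```

In practice the surrogate-key assignment would preserve existing keys across loads (for example, via a slowly changing dimension strategy) rather than renumbering on every run.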
Posted 1 day ago
2.0 - 3.0 years
0 Lacs
Telangana
On-site
Role: ML Engineer (Associate / Senior) Experience: 2-3 years (Associate), 4-5 years (Senior) Mandatory Skills: Python / MLOps / Docker and Kubernetes / FastAPI or Flask / CI/CD / Jenkins / Spark / SQL / RDB / Cosmos / Kafka / ADLS / API / Databricks Location: Bangalore Notice Period: less than 60 days Job Description: Other Skills: Azure / LLMOps / ADF / ETL We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof-of-concept to production, ensuring they deliver real-world impact and solve critical business challenges. Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions. Experience deploying ML models to production. Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders (see the sketch after this posting). Integrate machine learning models seamlessly into existing production systems. Continuously monitor and evaluate model performance, and retrain the models automatically or periodically. Streamline existing ML pipelines to increase throughput. Identify and address security vulnerabilities in existing applications proactively. Design, develop, and implement machine learning models, preferably for insurance-related applications. Well versed in the Azure ecosystem. Knowledge of NLP and Generative AI techniques; relevant experience will be a plus. Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus. Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.
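As one concrete illustration of the real-time inferencing APIs this posting describes, here is a minimal FastAPI sketch. The model artifact, feature names, and endpoint are hypothetical, not taken from the job description.

```python
# Minimal sketch of a real-time model-serving endpoint with FastAPI.
# model.joblib and the feature schema are illustrative assumptions.
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical pre-trained classifier

class Features(BaseModel):
    age: float
    vehicle_value: float
    prior_claims: int

@app.post("/predict")
def predict(features: Features):
    x = np.array([[features.age, features.vehicle_value, features.prior_claims]])
    return {"claim_risk": float(model.predict_proba(x)[0, 1])}
```

Served with, for example, `uvicorn main:app --workers 4`; a batch inferencing pipeline would instead score a whole table on a schedule (from Spark or ADF, say) and write results back to storage.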
Posted 1 day ago
10.0 years
8 - 10 Lacs
Gurgaon
On-site
Additional Locations: India-Haryana, Gurgaon Diversity - Innovation - Caring - Global Collaboration - Winning Spirit - High Performance At Boston Scientific, we’ll give you the opportunity to harness all that’s within you by working in teams of diverse and high-performing employees, tackling some of the most important health industry challenges. With access to the latest tools, information and training, we’ll help you in advancing your skills and career. Here, you’ll be supported in progressing – whatever your ambitions. Senior Software Engineer-MLOps We are looking for a highly skilled Senior Software Engineer – MLOps with deep expertise in building and managing production-grade ML pipelines in AWS and Azure cloud environments. This role requires a strong foundation in software engineering, DevOps principles, and ML model lifecycle automation to enable reliable and scalable machine learning operations across the organization. Key Responsibilities include: Design and build robust MLOps pipelines for model training, validation, deployment, and monitoring Automate workflows using CI/CD tools such as GitHub Actions, Azure DevOps, Jenkins, or Argo Workflows Build and manage ML workloads on AWS (SageMaker Unified Studio, Bedrock, EKS, Lambda, S3, Athena) and Azure (Azure ML Foundry, AKS, ADF, Blob Storage) Design secure and cost-efficient ML architecture leveraging cloud-native services Manage infrastructure using IaC tools such as Terraform, Bicep, or CloudFormation Implement cost optimization and performance tuning for cloud workloads Package ML models using Docker, and orchestrate deployments with Kubernetes on EKS/AKS Ensure robust CI/CD pipelines and infrastructure as code (IaC) using tools like Terraform or CloudFormation Integrate observability tools for model performance, drift detection, and lineage tracking (e.g., Fiddler, MLflow, Prometheus, Grafana, Azure Monitor, CloudWatch) Ensure model reproducibility, versioning, and compliance with audit and regulatory requirements (see the MLflow sketch after this posting) Collaborate with data scientists, software engineers, DevOps, and cloud architects to operationalize AI/ML use cases Mentor junior MLOps engineers and evangelize MLOps best practices across teams Required Qualifications: Bachelor's/Master’s in Computer Science, Engineering, or related discipline 10 years in DevOps, with 2+ years in MLOps Proficient with MLflow, Airflow, FastAPI, Docker, Kubernetes, and Git Experience with feature stores (e.g., Feast), model registries, and experiment tracking Proficiency in DevOps & MLOps automation using CloudFormation/Terraform/Bicep Requisition ID: 610750 As a leader in medical science for more than 40 years, we are committed to solving the challenges that matter most – united by a deep caring for human life. Our mission to advance science for life is about transforming lives through innovative medical solutions that improve patient lives, create value for our customers, and support our employees and the communities in which we operate. Now more than ever, we have a responsibility to apply those values to everything we do – as a global business and as a global corporate citizen. So, choosing a career with Boston Scientific (NYSE: BSX) isn’t just business, it’s personal. And if you’re a natural problem-solver with the imagination, determination, and spirit to make a meaningful difference to people worldwide, we encourage you to apply and look forward to connecting with you!
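The model versioning, reproducibility, and lineage duties above are commonly handled through MLflow's tracking server and model registry. A minimal sketch follows; the experiment name, toy model, and registry name are illustrative assumptions.

```python
# Minimal sketch of experiment tracking and model registration with MLflow.
# Experiment/model names and the toy dataset are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=42)
model = LogisticRegression(max_iter=200).fit(X, y)

mlflow.set_experiment("claims-risk")
with mlflow.start_run():
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Registering under a name yields versioned, auditable model lineage.
    mlflow.sklearn.log_model(model, "model",
                             registered_model_name="claims-risk-classifier")
```

Registered versions can then be promoted through the CI/CD pipeline, with drift monitors (Fiddler, Prometheus exporters, and the like) gating rollbacks.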
Posted 1 day ago
7.0 years
8 - 10 Lacs
Gurgaon
On-site
Additional Locations: India-Haryana, Gurgaon Diversity - Innovation - Caring - Global Collaboration - Winning Spirit - High Performance At Boston Scientific, we’ll give you the opportunity to harness all that’s within you by working in teams of diverse and high-performing employees, tackling some of the most important health industry challenges. With access to the latest tools, information and training, we’ll help you in advancing your skills and career. Here, you’ll be supported in progressing – whatever your ambitions. Software Engineer-MLOps We are seeking an enthusiastic and detail-oriented MLOps Engineer to support the development, deployment, and monitoring of machine learning models in production environments. This is a hands-on role ideal for candidates looking to grow their skills at the intersection of data science, software engineering, and DevOps. You will work closely with senior MLOps engineers, data scientists, and software developers to build scalable, reliable, and automated ML workflows across cloud platforms like AWS and Azure. Key Responsibilities include: Assist in building and maintaining ML pipelines for data preparation, training, testing, and deployment Support the automation of model lifecycle tasks including versioning, packaging, and monitoring Build and manage ML workloads on AWS (SageMaker Unified Studio, Bedrock, EKS, Lambda, S3, Athena) and Azure (Azure ML Foundry, AKS, ADF, Blob Storage) Assist with containerizing ML models using Docker, and deploying using Kubernetes or cloud-native orchestrators Manage infrastructure using IaC tools such as Terraform, Bicep, or CloudFormation Participate in implementing CI/CD pipelines for ML workflows using GitHub Actions, Azure DevOps, or Jenkins Contribute to testing frameworks for ML models and data validation (e.g., pytest, Great Expectations; see the sketch after this posting) Ensure robust CI/CD pipelines and infrastructure as code (IaC) using tools like Terraform or CloudFormation Participate in diagnosing issues related to model accuracy, latency, or infrastructure bottlenecks Continuously improve knowledge of MLOps tools, ML frameworks, and cloud practices. Required Qualifications: Bachelor's/Master’s in Computer Science, Engineering, or related discipline 7 years in DevOps, with 2+ years in MLOps Good understanding of MLflow, Airflow, FastAPI, Docker, Kubernetes, and Git Proficient in Python and familiar with Bash scripting Exposure to MLOps platforms or tools such as SageMaker Studio, Azure ML, or GCP Vertex AI. Requisition ID: 610751 As a leader in medical science for more than 40 years, we are committed to solving the challenges that matter most – united by a deep caring for human life. Our mission to advance science for life is about transforming lives through innovative medical solutions that improve patient lives, create value for our customers, and support our employees and the communities in which we operate. Now more than ever, we have a responsibility to apply those values to everything we do – as a global business and as a global corporate citizen. So, choosing a career with Boston Scientific (NYSE: BSX) isn’t just business, it’s personal. And if you’re a natural problem-solver with the imagination, determination, and spirit to make a meaningful difference to people worldwide, we encourage you to apply and look forward to connecting with you!
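For the pytest-based data validation mentioned above, a minimal sketch might look like the following; the schema, columns, and value ranges are illustrative assumptions rather than anything specified by the role.

```python
# Minimal sketch of data-validation tests with pytest.
# The batch schema and acceptable ranges are hypothetical.
import pandas as pd
import pytest

@pytest.fixture
def batch():
    # In a real pipeline this fixture would sample the incoming batch.
    return pd.DataFrame({
        "age": [34, 51, 29],
        "premium": [120.0, 340.5, 89.9],
        "policy_id": ["P-001", "P-002", "P-003"],
    })

def test_no_nulls_in_key_columns(batch):
    assert batch["policy_id"].notna().all()

def test_value_ranges(batch):
    assert batch["age"].between(18, 110).all()
    assert (batch["premium"] > 0).all()
```

Great Expectations covers the same ground declaratively, with expectation suites evaluated as a pipeline step rather than as unit tests.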
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives. Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred. Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence. Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy. Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency. Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments. Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes. Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture. Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution. Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps. Responsibilities Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics. Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability. Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance. Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform. Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making. Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams. Support data operations and sustainment activities, including testing and monitoring processes for global products and projects. Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams. Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements. Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs. Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams. Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams. Support the development and automation of operational policies and procedures, improving efficiency and resilience. Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies. 
Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery. Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps. Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals. Utilize technical expertise in cloud and data operations to support service reliability and scalability. Qualifications 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred. 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance. 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams. Experience in a lead or senior support role, with a focus on DataOps execution and delivery. Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences. Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements. Customer-focused mindset, ensuring high-quality service delivery and operational efficiency. Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment. Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation. Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements. Understanding of operational excellence in complex, high-availability data environments. Ability to collaborate across teams, building strong relationships with business and IT stakeholders. Basic understanding of data management concepts, including master data management, data governance, and analytics. Knowledge of data acquisition, data catalogs, data standards, and data management tools. Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results. Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
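Much of the real-time data observability this posting describes starts with simple probes such as freshness checks against an SLA. A minimal sketch, assuming a hypothetical `ingested_at` column and a two-hour SLA:

```python
# Minimal sketch of a data-freshness SLA probe.
# The column name and SLA threshold are hypothetical.
from datetime import datetime, timedelta, timezone
import pandas as pd

FRESHNESS_SLA = timedelta(hours=2)

def is_fresh(df: pd.DataFrame, ts_col: str = "ingested_at") -> bool:
    """True if the newest record in the batch is inside the SLA window."""
    latest = pd.to_datetime(df[ts_col], utc=True).max()
    return datetime.now(timezone.utc) - latest <= FRESHNESS_SLA

# Example: a batch whose newest row landed 30 minutes ago passes the check.
batch = pd.DataFrame({"ingested_at": [datetime.now(timezone.utc) - timedelta(minutes=30)]})
assert is_fresh(batch)
```

In an Azure setting a check like this would typically run as an ADF or Databricks job and raise an alert (for example via Azure Monitor) instead of an assertion.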
Posted 1 day ago
8.0 years
0 Lacs
India
Remote
Azure Data Engineer Location: Remote Shift: 6am - 3pm US Central time zone Job Summary: We are seeking a highly skilled Data Engineer with strong experience in PostgreSQL and SQL Server, as well as hands-on expertise in Azure Data Factory (ADF) and Databricks. The ideal candidate will be responsible for building scalable data pipelines, optimizing database performance, and designing robust data models and schemas to support enterprise data initiatives. Key Responsibilities: Design and develop robust ETL/ELT pipelines using Azure Data Factory and Databricks Develop and optimize complex SQL queries and functions in PostgreSQL Develop and optimize complex SQL queries in SQL Server Perform performance tuning and query optimization for PostgreSQL (see the sketch after this posting) Design and implement data models and schema structures aligned with business and analytical needs Collaborate with data architects, analysts, and business stakeholders to understand data requirements Ensure data quality, integrity, and security across all data platforms Monitor and troubleshoot data pipeline issues and implement proactive solutions Participate in code reviews, sprint planning, and agile ceremonies Required Skills & Qualifications: 8+ years of experience in data engineering or a related field Strong expertise in PostgreSQL and SQL Server development, performance tuning, and schema design Experience in data migration from SQL Server to PostgreSQL Hands-on experience with Azure Data Factory (ADF) and Databricks Proficiency in SQL, Python, or Scala for data processing Experience with data modeling techniques (e.g., star/snowflake schemas, normalization) Familiarity with CI/CD pipelines, version control (Git), and agile methodologies Excellent problem-solving and communication skills If interested, share your resume on aditya.dhumal@leanitcorp.com
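For the PostgreSQL performance tuning this role calls for, the usual loop is: inspect the plan with EXPLAIN ANALYZE, add an index for the hot predicate, and re-check. Below is a minimal sketch from Python via psycopg2; the connection string, table, and column names are illustrative assumptions.

```python
# Minimal sketch of plan inspection and index creation in PostgreSQL.
# Connection details and schema are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=etl password=secret host=localhost")
conn.autocommit = True

with conn.cursor() as cur:
    # A sequential scan on a large table here usually means a missing index.
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,))
    for (line,) in cur.fetchall():
        print(line)

    # Index the hot predicate, then re-run the EXPLAIN to confirm an index scan.
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)")

conn.close()
```

The same discipline applies on the SQL Server side, with execution plans and appropriate nonclustered indexes.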
Posted 1 day ago
2.0 years
3 - 10 Lacs
India
Remote
Job Title - Sr. Data Engineer Experience - 2+ Years Location - Indore (onsite) Industry - IT Job Type - Full time Roles and Responsibilities- 1. Design and develop scalable data pipelines and workflows for data ingestion, transformation, and integration. 2. Build and maintain data storage systems, including data warehouses, data lakes, and relational databases. 3. Ensure data accuracy, integrity, and consistency through validation and quality assurance processes. 4. Collaborate with data scientists, analysts, and business teams to understand data needs and deliver tailored solutions. 5. Optimize database performance and manage large-scale datasets for efficient processing. 6. Leverage cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Kafka) for building robust data solutions. 7. Automate and monitor data workflows using orchestration frameworks such as Apache Airflow (see the sketch after this posting). 8. Implement and enforce data governance policies to ensure compliance and data security. 9. Troubleshoot and resolve data-related issues to maintain seamless operations. 10. Stay updated on emerging tools, technologies, and trends in data engineering. Skills and Knowledge- 1. Core Skills: ● Proficient in Python (libraries: Pandas, NumPy) and SQL. ● Knowledge of data modeling techniques, including: ○ Entity-Relationship (ER) Diagrams ○ Dimensional Modeling ○ Data Normalization ● Familiarity with ETL processes and tools like: ○ Azure Data Factory (ADF) ○ SSIS (SQL Server Integration Services) 2. Cloud Expertise: ● AWS Services: Glue, Redshift, Lambda, EKS, RDS, Athena ● Azure Services: Databricks, Key Vault, ADLS Gen2, ADF, Azure SQL ● Snowflake 3. Big Data and Workflow Automation: ● Hands-on experience with big data technologies like Hadoop, Spark, and Kafka. ● Experience with workflow automation tools like Apache Airflow (or similar). Qualifications and Requirements- ● Education: ○ Bachelor’s degree (or equivalent) in Computer Science, Information Technology, Engineering, or a related field. ● Experience: ○ Freshers with a strong understanding, relevant internships, and academic projects are welcome. ○ 2+ years of experience working with Python, SQL, and data integration or visualization tools is preferred. ● Other Skills: ○ Strong communication skills, especially the ability to explain technical concepts to non-technical stakeholders. ○ Ability to work in a dynamic, research-oriented team with concurrent projects. Job Types: Full-time, Permanent Pay: ₹300,000.00 - ₹1,000,000.00 per year Benefits: Paid sick time Provident Fund Work from home Schedule: Day shift Monday to Friday Weekend availability Supplemental Pay: Performance bonus Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred) Experience: Data Engineer: 2 years (Preferred) Work Location: In person Application Deadline: 31/08/2025
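As referenced in responsibility 7, a minimal Apache Airflow sketch of a daily ingest-then-transform workflow is shown below; the DAG id, task bodies, and schedule are illustrative assumptions (Airflow 2.x API).

```python
# Minimal sketch of a two-step daily pipeline in Apache Airflow (2.x).
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pulling source files into the lake")  # placeholder step

def transform():
    print("building curated tables")  # placeholder step

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task
```

Monitoring then comes largely from the Airflow UI itself, with retries, SLAs, and alerting configured per task.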
Posted 1 day ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level: - Basic: - What is ADF and its key features? - What is the difference between ADF Faces and ADF Task Flows? - Medium: - Explain the lifecycle of an ADF application. - How do you handle exceptions in ADF applications? - Advanced: - Discuss the advantages of using ADF Business Components. - How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!