
4996 Data Governance Jobs - Page 43

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

10 - 15 Lacs

Pune

Work from Office

We are looking for a highly skilled and experienced Data Engineer with over 5 years of experience to join our growing data team. The ideal candidate will be proficient in Databricks, Python, PySpark, and Azure, and have hands-on experience with Delta Live Tables. In this role, you will be responsible for developing, maintaining, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. You will collaborate with cross-functional teams to build robust data infrastructure and enable data-driven decision-making.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using PySpark and Databricks
- Build and optimize Spark jobs for processing large volumes of structured and unstructured data
- Integrate data from multiple sources into data lakes and data warehouses on Azure cloud
- Develop and manage Delta Live Tables for real-time and batch data processing
- Collaborate with data scientists, analysts, and business teams to ensure data availability and quality
- Ensure adherence to best practices in data governance, security, and compliance
- Monitor, troubleshoot, and optimize data workflows and ETL processes
- Maintain up-to-date technical documentation for data pipelines and infrastructure components

Qualifications:
- 5+ years of hands-on experience in Databricks platform development
- Proven expertise in Delta Lake and Delta Live Tables
- Strong SQL and Python/Scala programming skills
- Experience with cloud platforms such as Azure, AWS, or GCP (preferably Azure)
- Familiarity with data modeling and data warehousing concepts
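The Delta Live Tables and pipeline work this role describes revolves around incremental upserts into Delta tables. As a minimal sketch of the MERGE-style matched/unmatched semantics involved (plain Python standing in for the Spark/Delta APIs; the `orders` table and `id` key are hypothetical):

```python
def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """MERGE-style upsert: matched rows are updated, unmatched rows inserted."""
    merged = {k: dict(v) for k, v in target.items()}
    for row in updates:
        existing = merged.get(row[key], {})
        merged[row[key]] = {**existing, **row}  # update wins on overlapping fields
    return merged

# Hypothetical target table and incoming change batch.
orders = {1: {"id": 1, "status": "open", "amount": 100}}
batch = [{"id": 1, "status": "shipped"},             # matched  -> update
         {"id": 2, "status": "open", "amount": 50}]  # unmatched -> insert
orders = merge_upsert(orders, batch)
```

In Databricks this logic is usually a single `MERGE INTO` statement or a DLT apply-changes step; the dict version only illustrates the matched/unmatched rule.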

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

We are seeking a skilled Business Analyst with 4-6 years of experience, including at least 2 years in Azure Data Engineering projects, for a 6-month remote full-time role. The ideal candidate will work closely with stakeholders to gather and analyze business and technical requirements, collaborate with Azure Data Engineers, and support design decisions across data integration, transformation, and storage layers. Strong SQL skills, understanding of data governance, and experience in data platforms are essential. Excellent communication and stakeholder management skills are required. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 2 weeks ago

Apply

5.0 - 7.0 years

15 - 18 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.
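The real-time processing stack this role names (Kafka feeding Spark) usually reduces to windowed aggregations over an event stream. A sketch of a tumbling-window count in plain Python — the event tuples and 60-second window are illustrative assumptions, not any specific framework's API:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Assign (timestamp, key) events to fixed windows and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # start of the bucket
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical event stream: (epoch seconds, event type).
events = [(0, "click"), (30, "click"), (65, "view"), (70, "click")]
result = tumbling_window_counts(events)
```

Spark Structured Streaming expresses the same idea with `groupBy(window(...), key).count()`; the point here is only the bucketing rule.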

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru, Karnataka

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Gurugram, Ahmedabad

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

12 - 14 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking an SAP MDG Consultant with 8-10 years of experience in SAP MDG (Master Data Governance). The consultant should have a strong techno-functional background with expertise in MDG data model build, Business Partner, Finance, and MM domains. The role involves implementing MDG BRF+, managing mass changes, and understanding the Data Replication Framework. Knowledge of data distribution using BD64, partner profiles, and RFCs is required. Exposure to ALE IDocs for master data and debugging skills in SAP (especially ABAP) are a big plus. The candidate should be comfortable with remote work and be willing to travel to Manila in January to run onboarding sessions for the new support team. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 2 weeks ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities:
- Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources.
- Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis.
- Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing.
- Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting.
- Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage.
- Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability.
- Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats.
- Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities.
- Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team.

Requirements:
- Strong programming skills in Java, Python, or Scala.
- Strong understanding of data modelling, data warehousing, and ETL processes.
- Minimum 4 to maximum 6 years of relevant experience.
- Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
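The extract-transform-load flow described above can be sketched end to end with only the standard library — the CSV input, the `sales` table, and the cleaning rules are hypothetical examples, not a production design:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str):
    """Extract: parse raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalize names, cast amounts, drop rows missing an amount."""
    return [{"name": r["name"].strip().title(), "amount": float(r["amount"])}
            for r in rows if r["amount"]]

def load(rows, conn):
    """Load: insert cleaned rows into a warehouse-style table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:name, :amount)", rows)

raw = "name,amount\n alice ,10.5\nbob,\ncarol,3\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

At scale the same three stages map onto Spark jobs or ETL frameworks; the structure (pure transform between an extract and a load) is what carries over.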

Posted 2 weeks ago

Apply

4.0 - 9.0 years

10 - 12 Lacs

Bengaluru, Doddakannell, Karnataka

Work from Office

We are seeking a highly skilled Data Engineer with expertise in ETL techniques, programming, and big data technologies. The candidate will play a critical role in designing, developing, and maintaining robust data pipelines, ensuring data accuracy, consistency, and accessibility. This role involves collaboration with cross-functional teams to enrich and maintain a central data repository for advanced analytics and machine learning. The ideal candidate should have experience with cloud-based data platforms, data modeling, and data governance processes. Location: Bengaluru, Doddakannell, Karnataka, Sarjapur Road

Posted 2 weeks ago

Apply

5.0 - 7.0 years

11 - 16 Lacs

Mumbai, Pune, Chennai

Work from Office

We're looking for a Power BI Engineer who can transform raw data into actionable insights through compelling dashboards and reports. This role blends technical expertise with business acumen to support decision-making across departments.

Key Responsibilities:
- Design, develop, and maintain interactive Power BI dashboards and reports
- Collaborate with stakeholders to gather business requirements and translate them into data models
- Perform data extraction, transformation, and loading (ETL) from various sources
- Optimize performance of reports and datasets for scalability and responsiveness
- Implement row-level security and manage user access within Power BI
- Ensure data accuracy and integrity through validation and testing
- Provide training and support to end-users on Power BI features and best practices

Required Skills:
- Proficiency in Power BI Desktop, Power BI Service, and DAX (Data Analysis Expressions)
- Strong knowledge of SQL and relational databases
- Experience with data modeling, data warehousing, and ETL processes
- Familiarity with Azure, SharePoint, or other Microsoft ecosystem tools
- Understanding of data governance, security, and compliance standards
- Bonus: Experience with Python, R, or other analytics tools

Qualifications:
- Bachelor's degree in computer science, Data Analytics, or related field
- 5-7 years of experience in BI development or data analysis
- Microsoft Power BI certification is a plus

Soft Skills:
- Excellent communication and storytelling abilities
- Strong analytical thinking and attention to detail
- Ability to work independently and in cross-functional teams
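Row-level security, one of the responsibilities named here, boils down to filtering a dataset to the slices a user's roles allow. A plain-Python sketch of that rule — the role names and `region` column are made up for illustration; in Power BI itself this is expressed as a DAX filter attached to a role:

```python
def apply_rls(rows, user_roles, role_filters):
    """Return only the rows whose region the user's roles permit."""
    allowed = set()
    for role in user_roles:
        allowed |= role_filters.get(role, set())  # union of all role grants
    return [r for r in rows if r["region"] in allowed]

# Hypothetical dataset and role-to-region rules.
sales = [{"region": "EU", "amount": 10}, {"region": "US", "amount": 20}]
filters = {"eu_analyst": {"EU"}, "global": {"EU", "US"}}
visible = apply_rls(sales, ["eu_analyst"], filters)
```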

Posted 2 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are looking for a dynamic and results-driven marketing operations leader to manage and lead a team of marketing campaign operations specialists in India. This role demands a passionate professional who excels at leveraging data, optimizing processes, and driving operational success at scale. You will spearhead the efforts of the Global Campaign Operations team, ensuring the efficiency, expansion, and impact of our marketing technology and processes. As the leader of the India Marketing Operations team, you will focus on driving collaboration, strategy, and clear communication across regional campaign operations teams. Your leadership will play a key role in aligning global initiatives, identifying opportunities for process improvements, and fostering innovation to scale efforts effectively. The ideal candidate thrives in a fast-paced environment and is skilled at managing, motivating, and empowering teams. You must be a proactive problem solver who can independently navigate challenges while building strong relationships and promoting open communication. As Manager of Marketing Operations, you will translate complex business requirements from stakeholders into actionable solutions, delegating work strategically to your team in India and ensuring operational excellence.

Program Process Collaboration and Optimization:
- Define and implement best practices for global campaign operations and marketing operations support to promote efficiency and scalability.
- Collaborate with global stakeholders to support regional needs for Campaign Operations (COPs) and Marketing Operations (MOPs) work, providing expert MOPs guidance and solutions when needed.
- Partner with stakeholders on reporting needs, including creating custom reports, guiding the interpretation of data, and ensuring closed-loop processes for program data requests.

Marketing Technology:
- Lead the strategic planning and operational integration of tools and systems within the MarTech stack to drive innovation and streamline workflows.
- Manage production support for marketing automation systems, coordinating with internal and external technical teams to troubleshoot issues, recommend enhancements, and oversee ongoing improvement efforts.
- Conduct quality assurance (QA) checks and support for email campaign builds, including workflows, list management, template optimization, and troubleshooting via the MAP.
- Oversee the monitoring, auditing, and mitigation of integration issues within the MarTech stack, ensuring timely resolution and reporting.
- Support the global MOPs team with coding requests, including developing email templates, custom JavaScript for forms, and landing pages to enable technical execution of campaigns.

Team Management and Support:
- Manage and lead the India Marketing Operations (MOPs) team, ensuring work is delegated effectively, resources are utilized optimally, and the intake and collaboration process is continuously improved.
- Act as a mentor and coach to the team, fostering professional growth, collaboration, and alignment with global marketing operations initiatives.

Team Leadership / Partnership:
- Identify, prioritize, and execute projects that enhance campaign performance and optimize departmental scalability, from gathering requirements with key stakeholders to technical delivery, analysis, and feedback.
- Collaborate with the Marketing Analytics team to build operational dashboards that track data health, program performance, and lead lifecycle insights.
- Ensure alignment across the Marketing Operations team globally, supporting onboarding, training, and enabling project execution while driving operational excellence.
- Maintain up-to-date and accessible documentation for all marketing operations processes, ensuring consistency and transparency across the organization.

Experience:
- 8+ years of marketing automation experience
- Marketo Certification, Expert level, required

Knowledge and Skills:
You are passionate about mastering marketing technology (MarTech) tools, optimizing processes, and driving operational efficiencies. You excel at managing stakeholder relationships, driving cross-functional collaboration, and leveraging marketing technology to deliver scalable and efficient solutions. You thrive in bridging technical expertise with streamlined processes to meet organizational goals.
- Experience with Marketo, ON24 Webinars, BrightTalk, TechTarget, Integrate, CVENT, Google Ads, LinkedIn, and Salesforce
- Proven ability to manage stakeholder requests and communication, ensuring alignment and timely delivery of solutions that meet global operational needs
- Strong experience collaborating and coordinating across cross-functional, global teams to streamline workflows and drive organizational success
- In-depth knowledge of marketing automation and campaign technologies, with expertise in Marketo administration, including nurture campaigns, lead scoring, integrations, workflows, webhooks, and APIs
- Broad knowledge of MarTech tools and the ability to support their integration and operational build-out across large-scale marketing operations
- Strong understanding of database management and data governance
- Experience in optimizing end-to-end processes to enhance efficiency and scalability, balancing stakeholder priorities with technical feasibility
- Skilled in managing tasks and projects with multiple stakeholders, ensuring successful completion through thoughtful planning and execution
- Adept at creating and maintaining marketing automation workflows, building campaigns, and improving processes to enable marketing success
- Demonstrated success in managing vendor relationships and ensuring MarTech solutions meet organizational requirements
Solid understanding of CRM systems, such as Salesforce, with the ability to align CRM workflows with marketing operations processes.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

30 - 45 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & Responsibilities

Data Security & Governance:
- Possess a good understanding of data classification models (PII, PHI, financial data, etc.)
- Understanding of data access governance
- Familiarity with authentication/authorization and the least-privilege principle
- Good to have: knowledge of compliance frameworks (SOX, ISO, HIPAA, etc.)
- Defining, documenting, enforcing, and managing access policies

Snowflake and Databricks Expertise:
- Proficiency in Snowflake; solid understanding of role hierarchies, permissions, and data sharing
- Hands-on with SQL and stored procedures
- Experience with Databricks Lakehouse, Unity Catalog, and access control

Platform Skills:
- Building dashboards for monitoring, data exposure, and threats
- Solid understanding of out-of-the-box features and customization techniques of any tool

Alerting & Monitoring:
- Interpreting dashboards; deriving insights from behavior analytics to detect anomalies
- Scripting knowledge for automation (enforcing policies)
- Experience with a DSPM tool such as Varonis will be an added advantage
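Role hierarchies of the kind Snowflake uses mean a role inherits the privileges of every role granted to it. A minimal sketch of resolving a role's effective privileges through such a hierarchy — the role and privilege names are hypothetical, and real platforms resolve this server-side:

```python
def effective_privileges(role, grants, hierarchy):
    """Walk the role hierarchy and union all inherited privilege grants."""
    privs, stack, seen = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue  # guard against cycles and repeated grants
        seen.add(r)
        privs |= grants.get(r, set())
        stack.extend(hierarchy.get(r, []))  # descend into granted roles
    return privs

# Hypothetical hierarchy: admin inherits analyst, analyst inherits reader.
hierarchy = {"admin": ["analyst"], "analyst": ["reader"]}
grants = {"reader": {"SELECT"}, "analyst": {"CREATE VIEW"}, "admin": {"GRANT"}}
admin_privs = effective_privileges("admin", grants, hierarchy)
```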

Posted 2 weeks ago

Apply

7.0 - 10.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Meet the Team
Join Cisco's cutting-edge Manufacturing Data & Analytics team, a group of innovators dedicated to transforming manufacturing operations through data-driven digital twin solutions. Our team collaborates closely with business architects, engineers, and manufacturing leaders to deliver actionable insights, process optimization, and measurable value across global supply chain and manufacturing environments. We are passionate about leveraging advanced analytics, integration, and simulation to drive operational excellence and power Cisco's digital transformation.

Your Impact
As a Data & Analytics Solution Architect, you will lead the architecture and implementation of digital twin solutions in manufacturing. Your work will enable real-time data integration, advanced simulations, and actionable analytics, supporting smarter, faster decision-making across operations. By defining and tracking digital twin KPIs, you will help maximize operational impact and deliver measurable business value. In this highly visible role, you will work directly with stakeholders and technical teams to ensure the success of Cisco's advanced manufacturing initiatives.

In this role, you will:
- Coordinate with business architects and stakeholders to gather requirements for data, reporting, analytics, and digital twin solutions.
- Analyze business needs and design conceptual models for end-to-end BI and digital twin implementations.
- Translate requirements into technical specifications for digital twin and analytics initiatives.
- Engineer data pipelines and models integrating sensor, MES, and IoT data from manufacturing operations.
- Develop and maintain data marts and presentation views for digital twin outputs and business intelligence.
- Leverage business intelligence and visualization tools (Power BI, Tableau) to deliver insights from digital twin simulations.
- Generate analytics and reports to support operational decision-making and performance monitoring.
- Define, develop, and monitor digital twin KPIs (e.g., model accuracy, anomaly detection, downtime reduction, yield improvement, simulation cycle time, decision support utilization, integration coverage, cost savings, alert response time, and user adoption).
- Conduct reviews of digital twin KPIs, identify trends, and drive continuous improvement.
- Enforce industry standards for data architecture, modeling, and process documentation.
- Evaluate, troubleshoot, and improve digital twin models and BI assets.
- Manage the full development lifecycle, including change management and production migration.
- Support database architects and developers in building real-time and historical manufacturing analytics systems.
- Create and maintain technical specification documentation and end-user training materials.

Key Responsibilities:
- Gather, analyze, and document business and technical requirements for digital twin and BI solutions.
- Design and implement data integration, modeling, and presentation layers for manufacturing analytics.
- Develop actionable analytics and performance reports for semiconductor manufacturing.
- Define and track digital twin KPIs, driving operational improvement and value measurement.
- Ensure best practices in data architecture, modeling, and process documentation.
- Manage solution lifecycle, including testing, deployment, and ongoing enhancement.
- Deliver training and support for analytics tools and digital twin platforms.

Minimum Qualifications:
- 7-10 years of experience in data and business intelligence implementation, with 3-5 years focused on Supply Chain Operations, Digital Twin Implementation, or Manufacturing.
- Bachelor's degree in Data Science, Computer Science, Engineering, or related field.
- Proficiency in data analysis languages (Python, R, SQL) and visualization platforms (Power BI, Tableau).
- Experience with digital twin tools and concepts in a manufacturing environment.
- Strong understanding of semiconductor manufacturing process flows and equipment data.

Preferred Qualifications:
- Experience with Siemens Digital Industries (Tecnomatix, MindSphere), PTC ThingWorx, Dassault Systèmes 3DEXPERIENCE, or Ansys Twin Builder.
- Experience developing and optimizing digital twin models in manufacturing.
- Strong communication skills and ability to work effectively in cross-functional teams.
- Demonstrated ability to translate business needs into technical solutions.
- Experience developing end-user training for analytics and digital twin platforms.

Who You'll Work With:
- Internal Teams: Business architects, manufacturing engineers, supply chain operations, database architects, and analytics developers.
- Stakeholders: Manufacturing leaders, process owners, IT, and data governance teams.
- External Partners: Technology vendors (digital twin and analytics platforms), solution integrators.
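One of the digital twin KPIs this role tracks, anomaly-detection accuracy, is commonly reported as precision and recall against labelled events. A small stdlib sketch with made-up labels, purely to illustrate how such a KPI is computed:

```python
def anomaly_kpis(actual, predicted):
    """Precision and recall of predicted anomaly flags vs ground truth."""
    tp = sum(1 for a, p in zip(actual, predicted) if a and p)
    fp = sum(1 for a, p in zip(actual, predicted) if not a and p)
    fn = sum(1 for a, p in zip(actual, predicted) if a and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"precision": precision, "recall": recall}

# Hypothetical per-interval anomaly labels (1 = anomaly observed/flagged).
actual    = [1, 0, 1, 1, 0, 0]
predicted = [1, 1, 1, 0, 0, 0]
kpis = anomaly_kpis(actual, predicted)
```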

Posted 2 weeks ago

Apply

6.0 - 10.0 years

11 - 16 Lacs

Mumbai, Pune, Chennai

Work from Office

We're looking for a Power BI Engineer who can transform raw data into actionable insights through compelling dashboards and reports. This role blends technical expertise with business acumen to support decision-making across departments.

Key Responsibilities:
- Design, develop, and maintain interactive Power BI dashboards and reports.
- Collaborate with stakeholders to gather business requirements and translate them into data models.
- Perform data extraction, transformation, and loading (ETL) from various sources.
- Optimize performance of reports and datasets for scalability and responsiveness.
- Implement row-level security and manage user access within Power BI.
- Ensure data accuracy and integrity through validation and testing.
- Provide training and support to end-users on Power BI features and best practices.

Required Skills:
- Proficiency in Power BI Desktop, Power BI Service, and DAX (Data Analysis Expressions).
- Strong knowledge of SQL and relational databases.
- Experience with data modeling, data warehousing, and ETL processes.
- Familiarity with Azure, SharePoint, or other Microsoft ecosystem tools.
- Understanding of data governance, security, and compliance standards.
- Bonus: Experience with Python, R, or other analytics tools.

Qualifications:
- Bachelor's degree in computer science, Data Analytics, or related field.
- 5-7 years of experience in BI development or data analysis.
- Microsoft Power BI certification is a plus.

Soft Skills:
- Excellent communication and storytelling abilities.
- Strong analytical thinking and attention to detail.
- Ability to work independently and in cross-functional teams.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Business Adm., Master of Commerce, Master of Engineering, Master of Technology, Master of Technology (Integrated), Bachelor of Business Adm., Bachelor of Commerce, Bachelor of Engineering, Bachelor of Technology, Bachelor of Technology (Integrated)

Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains

Technical and Professional Requirements:
- At least 2 years of configuration and development experience in the implementation of OFSAA solutions (such as ERM, EPM, etc.)
- Expertise in implementing OFSAA technical areas covering OFSAAI and frameworks: Data Integrator, Metadata Management, Data Modelling
- Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyze result area tables and derived entities
- Perform data analysis using OFSAA metadata (i.e., technical metadata, rule metadata, business metadata), identify any data mapping gaps, and report them to stakeholders
- Participate in requirements workshops, help implement the designed solution, perform testing (UT, SIT), coordinate user acceptance testing, etc.
- Knowledge of and experience with the full SDLC lifecycle
- Experience with Lean / Agile development methodologies

Preferred Skills: Technology->Oracle Industry Solutions->Oracle Financial Services Analytical Applications (OFSAA)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai, Pune, Chennai

Work from Office

1. Background
The Client (the "CLIENT" or the "Bank") is progressing a programme of work to deliver business and operational efficiencies to become a leaner and more effective organisation. Ongoing programmes of work include improvements to business processes, generating staff efficiencies through process improvements, IT transformation, and embedding agile delivery methodologies. The Bank needs a consultant to help with a programme based at the CLIENT's London HQ, requiring a Technical Lead for a key digitalisation project in the Environmental and Sustainability department. This involves implementing Microsoft Power Platform technology using Agile methodologies and DevOps practices.

2. Objectives of the Consulting Services
The consultant's services as a technical lead will be to contribute to the development of the internal Power Platform solution, using a partner-led development team, to business requirements. The consultant will work in CLIENT IT to put in place sustainable development practices and industry-standard processes, and bring knowledge of IT development practices. The Power Platform Tech Lead oversees the design, development, implementation, and maintenance of Power Platform solutions within CLIENT's IT department. They manage a team of offshore/nearshore developers and technical experts to effectively use Power Platform applications and technologies to achieve business goals. Typical responsibilities of this role include:

Technical Architecture and Solution Design:
a. Define the technical architecture for Power Platform solutions to ensure scalability, flexibility, and alignment with business requirements.
b. Design and plan end-to-end solutions that integrate Power Platform components effectively.

Customization and Development:
a. Oversee customization and development efforts, including custom plugins, workflows, model-driven apps, canvas apps, and more within Power Platform.
b. Ensure adherence to best practices and coding standards in custom development.
Integration and Data Management:
a. Define integration guidelines to seamlessly connect Power Platform with other systems, ensuring data consistency and integrity.
b. Guide the integration of data and processes between Power Platform solutions and other internal applications.

Power Automate and Power Apps:
a. Guide the development of Power Automate workflows and Power Apps solutions to automate business processes and enhance user experiences.
b. Optimize the use of connectors and services within Power Automate and Power Apps to achieve desired outcomes.

Leadership:
a. Lead technical teams, providing guidance, mentorship, and support throughout the project lifecycle.
b. Coordinate with project managers and stakeholders to plan, organize, and prioritise development efforts based on project requirements and timelines.

Documentation and Knowledge Sharing:
a. Ensure comprehensive documentation of technical designs, customizations, integrations, and configurations for future reference and knowledge sharing.
b. Conduct training sessions and workshops to educate the team and stakeholders on Power Platform and related technologies.

Communication and Stakeholder Engagement:
a. Communicate technical concepts and solutions effectively to both technical and non-technical stakeholders.
b. Collaborate with stakeholders to gather technical requirements, provide updates, and align technical solutions with business needs.

Security and Compliance:
a. Ensure Power Platform implementations adhere to security best practices and comply with relevant regulations and standards.
b. Address security considerations and potential vulnerabilities in the solutions developed.

Deployment and Maintenance:
a. Oversee the deployment of Power Platform solutions, ensuring smooth transitions and minimal disruption to operations.
b. Manage post-deployment maintenance and updates, addressing any issues, bugs, or performance optimizations.
Requirements Analysis and Scoping:
a. Collaborate with business analysts and stakeholders to analyse business requirements and define the scope of projects, ensuring a clear understanding of objectives and deliverables.

3. Scope of Services
The consultant will provide services as part of a CLIENT ecosystem of multiple third parties, staff, and consultants.

4. Implementation Arrangements
The Consultant will work closely with the offshore Software Development Manager, receiving further technical guidance from the Associate Director-Capability Lead, Enterprise IT Architect, and Software Delivery Manager. The Bank will not attempt to direct, control, or oversee the Consultant's working methods in delivering services. Within the scope of their duties, the Consultant will follow the relevant Bank policies, procedures, and standards.

5. Deliverables
The Consultant will contribute to the following activities, and work across the project team and the Bank to ensure tasks and goals are delivered on time and adhere to defined quality standards:
- Contribute to drawing up specific technical proposals using the Power Platform for meeting business requirements in the backlog
- Work closely with IT disciplines and business stakeholders to transpose requirements into technical designs and specifications, aligning with the overall project and the Bank's business demand roadmaps and architecture governance
- Estimate and plan with your scrum team of developers, Business Analyst (BA), Quality Assurance (QA), Product Owner, and IT Architect
- Technical Leadership: Provide technical leadership and guidance to a team of developers working on Power Platform projects.
- Power Platform Development: Design, develop, and implement solutions using Power Apps and Power Automate, ensuring high-quality code and adherence to best practices.
Solution Architecture: Collaborate with architects and business analysts to design a solution that is scalable, secure, and efficient while adhering to licensing requirements, in the most suitable, cost-effective way.
Team Collaboration: Work closely with cross-functional teams, including business analysts, developers, and stakeholders, to ensure successful project delivery and alignment with business objectives.
Quality Assurance: Ensure the quality and performance of Power Platform solutions through code reviews, dev testing, and optimization.
Troubleshooting and Issue Resolution: Identify and resolve technical issues, troubleshoot problems, and provide timely resolutions to ensure smooth project execution. Trace operational issues found in production back to root cause and work with the development team to deliver a fix back into production.
Training and Mentoring: Provide technical guidance and mentorship to junior developers, fostering a culture of continuous learning and professional growth.
Deliver technical direction for continuous integration and continuous deployment as appropriate.
Lead the investigation, capture, prioritisation, and resolution of complex business IT problems using a variety of technical analysis techniques, distributing work amongst other team members.
5.1 Essential
Power Platform (Canvas/model-driven apps, SharePoint): at least 5+ years of experience
Dataverse implementation, data modeling, and data governance setup within Power Platform solutions: about 2-3 years of experience
Developing and supporting services using web APIs (SOAP/REST), JSON, and API management: at least 3 years
Understanding of, and ability to apply, modern development methodologies such as BDD, TDD, XP, etc. to support our Agile delivery practices
Experience in developing component designs and specifications from a high-level solution architecture
Evangelise Agile principles to help CLIENT and development teams establish and continuously improve working practices
Experience in delivering software in a formal regime using source control management and control gates, with relevant artifacts to support design, quality, and support
Experience working in a mid-sized corporate environment, successfully aligning solutions appropriately with wider roadmaps, architecture, and other initiatives
Ability to extend Power Platform capabilities with custom development in .NET, JavaScript, or other modern programming languages where necessary
Customization and Configuration: Ability to customize and configure Power Platform based on business requirements, including forms, entities, workflows, and business rules.
Integration: Knowledge of integration techniques and technologies to integrate Power Platform with other enterprise systems and third-party applications. Familiarity with APIs and web services for integrating external systems with Power Platform.
Azure Integration: Understanding of Azure services and how to utilize them for Power Platform integration, data storage, and other related purposes.
Power Platform: Proficiency in using and configuring Power Apps (model-driven and Canvas) and Power Automate.
Security and Authentication: Knowledge of security principles and authentication mechanisms in Power Platform to ensure data security and access control.
Performance Optimization: Skills in optimising Power Platform solutions for performance, including identifying and addressing performance bottlenecks.
Version Control: Experience using version control systems such as Git/GitHub, and CI tools such as Jenkins, for managing code and collaborating with a development team.
Troubleshooting and Debugging: Strong troubleshooting and debugging skills to identify and resolve technical issues in Power Platform solutions.
Testing Frameworks: Familiarity with testing frameworks and methodologies to ensure the quality and reliability of Power Platform applications.
DevOps Practices: Knowledge of DevOps practices, CI/CD pipelines, and automated deployment processes for Power Platform solutions.
Technical Documentation: Ability to create and maintain detailed technical documentation for Power Platform solutions, including low-level design, architecture, and code documentation.
6.3 Track Record
Working on Agile software delivery teams, following an iterative approach to deliver working software using Power Platform: 3 to 5+ years of experience, with 8+ years overall including technical lead experience
Led large-scale technical implementations of Power Platform solutions for an enterprise-level organisation digitalising its data
Hands-on experience: has built, customized, and deployed solutions across the Power Platform, including Power Apps, Power Automate, and Power BI
6.4 Education & Qualifications
Educated to degree level or equivalent, with a qualification in an IT, engineering, or scientific discipline desired.
Certifications such as Power Platform Developer Associate, Power Platform App Maker Associate, or Microsoft Certified Power Platform Solution Architect.
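The web-API skills listed above (REST/JSON against Dataverse, API management) can be sketched in Python. This is a minimal illustration, not the project's actual code: the org URL, entity set, and columns are placeholders, and authentication is omitted.

```python
import json
from urllib.parse import urlencode

def build_dataverse_query(org_url, entity_set, select=None, top=None):
    """Build an OData query URL for the Dataverse Web API (v9.2)."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if top is not None:
        params["$top"] = str(top)
    query = f"?{urlencode(params)}" if params else ""
    return f"{org_url}/api/data/v9.2/{entity_set}{query}"

def parse_odata_records(body):
    """Pull the record list out of a raw OData JSON response body."""
    return json.loads(body).get("value", [])

# Hypothetical org and entity, for illustration only
url = build_dataverse_query("https://contoso.crm.dynamics.com", "accounts",
                            select=["name", "accountid"], top=5)
records = parse_odata_records('{"value": [{"name": "Acme"}]}')
```

An HTTP client carrying an OAuth bearer token would issue a GET against `url`; the response body then feeds `parse_odata_records`.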

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

bengaluru

Work from Office

Responsibilities: As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: This job opening is for multiple locations: BANGALORE, BHUBANESWAR, MYSORE, HYD, CHENNAI, PUNE, COIMBATORE, THIRUVANANTHAPURAM. Please apply only if you have the skills mentioned under the technical requirements.
Technical and Professional Requirements:
Skills Required: Strong understanding of MDM concepts, data governance, and data stewardship. Proficiency in SQL and data validation techniques. Experience with one or more MDM platforms (Informatica, Reltio, SAP MDG, etc.). Familiarity with data modeling, data profiling, and data quality tools. Knowledge of integration technologies (ETL, APIs, messaging systems).
Preferred Skills: Technology->Data Services Testing->MDM Testing
Educational Requirements: Bachelor of Engineering
Service Line: Infosys Quality Engineering
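As a hedged illustration of the SQL data-validation techniques this role names, here is a minimal MDM-style referential check in Python using the stdlib sqlite3 module; the table and column names are invented for the example, not a real MDM schema.

```python
import sqlite3

def orphan_check(conn):
    """Find child records whose master key has no golden record."""
    sql = """
        SELECT c.id FROM contacts c
        LEFT JOIN master_accounts m ON c.account_id = m.id
        WHERE m.id IS NULL
        ORDER BY c.id
    """
    return [row[0] for row in conn.execute(sql)]

# In-memory fixture: one golden account, one contact pointing nowhere
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE master_accounts (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, account_id INTEGER)")
conn.execute("INSERT INTO master_accounts VALUES (1)")
conn.executemany("INSERT INTO contacts VALUES (?, ?)", [(10, 1), (11, 2)])
orphans = orphan_check(conn)  # contact 11 references a missing master record
```

The same LEFT JOIN / IS NULL pattern ports directly to the warehouse SQL dialect used on a given engagement.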

Posted 2 weeks ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

pune

Work from Office

Job Overview This position is responsible for the design and implementation of scalable data management practices and stewardship protocols across core GTM and other operational data domains (accounts, contacts, opportunities, activities, etc.). This role requires both technical proficiency and strong leadership skills to drive business decisions through data. About Us When you join iCIMS, you join the team helping global companies transform business and the world through the power of talent. Our customers do amazing things: design rocket ships, create vaccines, deliver consumer goods globally, overnight, with a smile. As the Talent Cloud company, we empower these organizations to attract, engage, hire, and advance the right talent. We're passionate about helping companies build a diverse, winning workforce and about building our home team. We're dedicated to fostering an inclusive, purpose-driven, and innovative work environment where everyone belongs. Responsibilities Establish and Operationalize GTM Data Standards - Define and maintain critical data elements, ownership models, and classification structures aligned with operational and compliance needs. Partner with Legal, Compliance, and InfoSec to ensure alignment with overall corporate/enterprise governance programs and policies. This role also plays a key part in contributing to enterprise-wide data governance, ensuring GTM-level practices are aligned with broader corporate policies and grounded in real-world operational use. Execute Data Quality Strategy - Develop and manage a data quality framework to monitor completeness, consistency, timeliness, and accuracy of GTM and other operational data. Build dashboards and KPIs to track performance, surface issues, and support root cause investigations. Collaborate with RevOps, Product, and Engineering on remediation plans that reduce friction in pipeline, forecasting, and engagement flows.
Advance Metadata and Lineage Visibility - Support development of a business glossary, metadata repository, and lineage mapping to ensure transparency across data workflows and enable better change impact analysis. Drive consistency and adoption of metadata standards across teams. Apply Agentic and Generative AI to Stewardship Workflows - Pilot AI solutions to automate repetitive tasks in data stewardship, including anomaly detection, issue triage, and documentation generation. Collaborate with stakeholders to identify where AI can enhance efficiency while maintaining control and compliance (like the EU AI Act). Support Risk Mitigation and Audit Readiness - Help maintain audit-ready documentation of data practices and control activities. Monitor risk exposure across GTM systems and SaaS tools, and support remediation with control owners. Ensure data retention, masking, and deletion policies are consistently enforced. Enable Cross-Functional Alignment on Data Standards - Work closely with Sales, Marketing, Product, RevOps, and Engineering teams to embed data practices into operational workflows and system designs. Support working groups by providing updates, surfacing gaps, and tracking adoption progress. Promote Data Literacy and Stewardship - Develop and deliver training, documentation, and support materials to help data users, analysts, and stewards understand data responsibilities and best practices. Drive a culture of transparency, accountability, and responsible data usage.
Qualifications 8+ years of experience in data quality, data management, or GTM data operations Experience implementing structured data practices across enterprise systems Proficiency with SQL, data profiling, and tooling (e.g., Alation, Collibra, Informatica) Familiarity with privacy and compliance regulations (e.g., GDPR, CCPA, SOC 2, & ISO) Experience with metadata management and data lineage practices Exposure to agentic or generative AI in data stewardship or operations Strong communication skills and ability to influence across technical and business teams Demonstrated program management expertise, including creating project plans, organizing and leading stakeholder meetings, and providing status updates on key milestones
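The data quality framework this role describes (monitoring completeness, timeliness, and accuracy of GTM records) can be sketched as a few simple KPI scores. The field names and thresholds below are illustrative assumptions, not iCIMS's actual framework:

```python
from datetime import datetime, timedelta

def quality_report(records, required_fields, freshness_days=30, now=None):
    """Score completeness, accuracy (email shape), and timeliness of records."""
    now = now or datetime.utcnow()
    total = len(records)
    complete = sum(all(r.get(f) for f in required_fields) for r in records)
    valid_email = sum("@" in (r.get("email") or "") for r in records)
    fresh = sum((now - r["updated_at"]) <= timedelta(days=freshness_days)
                for r in records if r.get("updated_at"))
    return {"completeness": complete / total,
            "accuracy": valid_email / total,
            "timeliness": fresh / total}

sample = [  # invented GTM contact rows, not the real schema
    {"name": "Acme", "email": "ops@acme.com", "updated_at": datetime(2024, 1, 20)},
    {"name": "Beta", "email": "not-an-email", "updated_at": datetime(2023, 1, 1)},
    {"name": "Gamma", "email": "", "updated_at": datetime(2024, 1, 30)},
    {"name": "Delta", "email": "dq@delta.io"},
]
report = quality_report(sample, ["name", "email"], now=datetime(2024, 1, 31))
```

In practice these scores would feed the dashboards and KPIs mentioned above, with each dimension trended over time per data domain.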

Posted 2 weeks ago

Apply

10.0 - 15.0 years

22 - 27 Lacs

mumbai, pune, chennai

Work from Office

Primary skillset: Technical Architect/Lead | Azure, Databricks, Apache Spark Secondary skillset: SQL, Python, Data Warehouse Job Description: Design scalable, secure, and high-performance cloud solutions on Microsoft Azure services. Design large-scale data processing and analytics solutions using Azure Databricks. Develop comprehensive architecture blueprints and technical documentation for data solutions. Lead the implementation of Databricks-based solutions, ensuring alignment with best practices and business requirements. Lead Data Mesh/data products architecture building in Databricks-based solutions. Provide technical leadership and guidance throughout the project lifecycle. Lead the implementation of Azure-based solutions, ensuring alignment with best practices and business requirements. Collaborate with stakeholders to gather requirements and translate them into technical solutions. Design and implement CI/CD pipelines using Azure DevOps. Ensure robust security measures are in place, including identity management, encryption, and network security. Optimize cloud resources for performance, cost, and reliability. Stay updated with the latest Azure services, features, and industry trends. Requirements: Bachelor's degree in computer science, Engineering, or a related field. 10-15 years of experience in data engineering or a related field. Extensive experience with Databricks and Apache Spark. Proficiency in programming languages such as Python. Strong knowledge of SQL and experience with relational databases. Experience with cloud platforms, especially Azure. Strong grounding in data warehousing solutions (e.g., Delta Lake, Lakehouse). Understanding of data governance and security best practices. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills.
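Delta Lake's MERGE (upsert), central to the Databricks work described above, can be illustrated by its semantics in plain Python. A real pipeline would use `DeltaTable.merge` or the Spark SQL shown in the comment; this is only a sketch of what the operation does:

```python
# Spark SQL equivalent on Databricks:
#   MERGE INTO target t USING updates u ON t.id = u.id
#   WHEN MATCHED THEN UPDATE SET *
#   WHEN NOT MATCHED THEN INSERT *
def merge_upsert(target, updates, key):
    """Update matched rows by key, insert unmatched ones (Delta MERGE semantics)."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        by_key.setdefault(row[key], {}).update(row)  # matched -> update, else insert
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
updates = [{"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
merged = merge_upsert(target, updates, "id")
```

On Delta Lake the same operation is transactional and conflict-checked, which is what makes it safe for concurrent batch and streaming writers.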

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

hyderabad, telangana

On-site

ANSR is hiring for one of its clients. About ArcelorMittal: ArcelorMittal, formed in 2006 from the merger of European company Arcelor and Indian-owned Mittal Steel, has established itself as the world's leading steel and mining company with operations in over 60 countries and a strong industrial presence in 18 nations. As a global team of 158,000+ talented individuals, we are dedicated to creating a better world through the production of smarter low-carbon steel. Our strategies prioritize innovation and sustainability, supplying various global markets with our products, from automotive and construction to household appliances and packaging, supported by cutting-edge R&D and distribution networks. ArcelorMittal Global Business and Technologies in India serves as a hub for technological innovation and business solutions. Our community of business professionals and technologists collaborates to bring diverse perspectives and experiences to revolutionize the global steel manufacturing industry. This environment fosters groundbreaking ideas and sustainable business growth opportunities. We cultivate a culture driven by entrepreneurship and excellence, focusing on the growth and development of our team members. With flexible career paths, access to the latest technology, and a space for learning and ownership, we offer an environment where you can tackle exciting challenges every day. Position Summary: ArcelorMittal is looking for a detail-oriented and technically skilled Visualization Maintenance Specialist to manage and maintain business intelligence solutions developed in Power BI and Tableau. In this role, you will be responsible for ensuring the reliability, performance, and accuracy of dashboards and reports essential for critical business decision-making. Collaborating with business users, BI developers, and data platform teams, you will support visualizations across departments and uphold data availability, security, and version control.
Key Responsibilities: **Dashboard Maintenance & Monitoring:** - Monitor scheduled data refreshes and report performance in Power BI and Tableau environments. - Troubleshoot and resolve issues related to dashboards, broken visuals, slow performance, and connectivity errors. - Manage report version control and ensure consistent updates across different environments. - Conduct QA testing of dashboards post data model updates or platform changes. **Data Source & Integration Support:** - Validate data source connections to ensure the stability and reliability of reports. - Collaborate with data engineering teams to align upstream data pipelines with reporting needs. - Maintain and troubleshoot extracts, published data sources, and shared datasets. **User Support & Access Management:** - Provide end-user support for visualization tools, including issue resolution and dashboard navigation. - Manage user permissions, access rights, and role-based security in Tableau and Power BI workspaces. - Act as a bridge between end-users and technical teams for reporting requirements and issue escalations. **Documentation & Best Practices:** - Maintain documentation of dashboard dependencies, data sources, and business logic. - Enforce governance by applying naming conventions, metadata standards, and usage guidelines for dashboards and reports. Required Qualifications: - Bachelor's degree in Information Systems, Business Intelligence, or a related field. - 0 to 3 years of experience in a BI support or visualization maintenance role. - Proficiency in Power BI (including Power BI Service) and Tableau (Desktop and Server/Cloud). - Strong understanding of data modeling, data refresh schedules, and query performance optimization. - Familiarity with SQL and data integration tools (e.g., Azure Data Factory, SAP BW connections). Preferred Qualifications: - Experience with enterprise-scale reporting environments and self-service BI frameworks. 
- Knowledge of scripting languages like DAX (for Power BI) and Tableau Calculated Fields. - Understanding of row-level security and data governance policies. - Experience collaborating with cross-functional data teams (data engineers, analysts, and business users). What We Offer: - A key role in promoting data-driven decision-making across the organization. - Access to industry-leading BI tools and modern data platforms. - Competitive compensation, training, and development opportunities. - Collaborative team environment with cross-department visibility.
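The refresh monitoring described above can be sketched as a small triage step in Python. The record shape below loosely mimics a refresh-history feed such as the one the Power BI REST API returns, but the field names here are assumptions for illustration:

```python
def failed_latest_refreshes(history):
    """Return datasets whose most recent refresh run failed."""
    latest = {}
    for run in sorted(history, key=lambda r: r["endTime"]):
        latest[run["dataset"]] = run["status"]  # later runs overwrite earlier ones
    return sorted(ds for ds, status in latest.items() if status == "Failed")

history = [
    {"dataset": "Sales", "endTime": "2024-05-01T01:00Z", "status": "Completed"},
    {"dataset": "Sales", "endTime": "2024-05-02T01:00Z", "status": "Failed"},
    {"dataset": "Ops",   "endTime": "2024-05-02T01:00Z", "status": "Completed"},
]
to_triage = failed_latest_refreshes(history)
```

Only the latest status per dataset matters for triage: an earlier failure followed by a successful run needs no action.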

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

You will be joining TekUncorked as a Data Management Specialist based in Gurgaon. Your primary responsibilities will include managing data collection, ensuring data quality, and implementing data security protocols. Your key duties will involve establishing data governance policies, designing and optimizing databases, creating backup and recovery plans, implementing security measures, and using validation tools to maintain data accuracy and consistency. You will also conduct data audits, monitor data handling procedures, and provide training to employees. Collaboration with various stakeholders to ensure data management practices align with organizational goals and IP security requirements will be an essential part of your role. To qualify for this position, you should have expertise in data governance, data quality, and data management. Additionally, familiarity with IT infrastructure, networking, security protocols, strong analytical skills, experience in master data management, attention to detail, proficiency in data analysis tools, and a Bachelor's degree in Data Management, Computer Science, or a related field are required. A minimum of 7-9 years of relevant experience in a product company is also expected.

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

coimbatore, tamil nadu

On-site

We are looking for a Data Architect with a minimum of 12 years of experience, including at least 4 years in Azure data services. As a Data Architect, you will be responsible for leading the design and implementation of large-scale data solutions on Microsoft Azure. Your role will involve leveraging your expertise in cloud data architecture, data engineering, and governance to create robust, secure, and scalable platforms. Key Skills: - Proficiency in Azure Data Factory, Synapse, Databricks, and Blob Storage - Strong background in Data Modeling and Lakehouse Architecture - Experience with SQL, Python, and Spark - Knowledge of Data Governance, Security, and Metadata Management - Familiarity with CI/CD practices, Infra as Code (ARM/Bicep/Terraform), and Git - Excellent communication skills and the ability to collaborate effectively with stakeholders Bonus Points For: - Azure Certifications such as Data Engineer or Architect - Hands-on experience with Event Hubs, Stream Analytics, and Kafka - Understanding of Microsoft Purview - Industry experience in healthcare, finance, or retail sectors Join our team to drive innovation through data, shape architecture strategies, and work with cutting-edge Azure technologies. If you are ready to make a significant impact in the field of data architecture, apply now or reach out to us for more information. For more information, please contact karthicc@nallas.com.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

maharashtra

On-site

As a Data Engineer, you will need to have 5 to 10 years of experience in software/data engineering. Your proficiency should include working with SQL, NoSQL databases such as DynamoDB and MongoDB, ETL tools, and data warehousing solutions. It is essential to have expertise in Python and familiarity with cloud platforms like Azure, AWS (e.g., EC2, S3, RDS) or GCP. Your role will also involve using data visualization tools like Tableau, Power BI, or Looker. Knowledge of data governance and security practices is crucial for maintaining data integrity. Experience with DevOps practices, including CI/CD pipelines and containerization (Docker, Kubernetes), will be beneficial. Effective communication skills in English, both verbal and written, are required for collaborating with team members and stakeholders. Working knowledge of Agile methodologies is necessary as you will be operating in Agile development environments. Additionally, having an understanding of AI and ML concepts, frameworks like TensorFlow and PyTorch, and practical applications is expected. Familiarity with Generative AI technologies and their potential use cases will be an advantage. This full-time position is based in IN-GJ-Ahmedabad, India-Ognaj (eInfochips) and falls under the Engineering Services category.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

At Skillsoft, we are dedicated to propelling organizations and individuals towards growth through transformative learning experiences. We firmly believe in the potential of each team member to achieve greatness. If you share our passion for revolutionizing learning and helping individuals unlock their full potential, we invite you to join us on this exciting journey. We are currently seeking a proactive and innovative Microsoft Copilot Studio AI Engineer to become a valuable part of our expanding innovation team. As the chosen candidate, you will be responsible for leading the design, development, and implementation of intelligent agents using Microsoft Copilot Studio and Azure AI tools. Your role will involve collaborating closely with cross-functional teams to automate business workflows, enhance productivity, and deliver exceptional user experiences across various platforms such as Microsoft 365, Salesforce, Workday, Dynamics 365, and more. This position presents a unique opportunity to influence AI transformation on an enterprise scale by leveraging low-code/no-code tools in conjunction with advanced AI orchestration through Azure AI Studio, Foundry, and OpenAI. If you are shortlisted, you will be invited to participate in a comprehensive 2-hour interview round, including a presentation highlighting your real-world experience in AI or Agentic AI projects, focusing on agent design, prompt engineering, and enterprise impact. As a Microsoft Copilot Studio AI Engineer, your key responsibilities will include: - Collaborating with stakeholders to identify business needs and define Copilot use cases. - Designing, building, and optimizing intelligent agents using Microsoft Copilot Studio, Azure AI Studio, and Azure AI Foundry. - Integrating agents into enterprise ecosystems such as Microsoft 365, Power Platform, Salesforce, Workday, and others via APIs and connectors. - Deploying agents across various channels, including Microsoft Teams, Slack, and web platforms. 
- Enhancing automation using Azure Logic Apps, Azure Functions, and Azure Cognitive Services. - Monitoring and analyzing performance using Azure Monitor, Application Insights, and feedback telemetry. - Designing human-in-the-loop (HITL) workflows to ensure accuracy, compliance, and trust. - Applying Responsible AI principles in the development, testing, and deployment processes. - Creating reusable templates, connectors, and governance frameworks to facilitate agent scalability. - Staying informed about Microsoft Copilot platform updates and industry best practices. - Rapidly prototyping and iterating in sandbox environments to test new agent capabilities. Required Skills & Expertise: Core Engineering Skills: - Hands-on experience with Microsoft Copilot Studio, Power Platform, and Azure AI services. - Proficiency in building solutions with Azure AI Foundry, Azure OpenAI, and Cognitive Services. - Strong programming background in C#, JavaScript, Python, or equivalent. - Ability to work with large datasets and BI tools for telemetry and feedback integration. AI/LLM Expertise: - Proficiency in working with large language models (LLMs) and prompt engineering. - Deep understanding of agent orchestration patterns, including memory management and feedback loops. - Experience in designing and deploying intelligent copilots using Azure AI Studio or Copilot Studio. Enterprise Integration: - Experience integrating with enterprise systems like Salesforce, Workday, and Microsoft 365. - Knowledge of API development, RBAC, data governance, and enterprise-grade security. Preferred Qualifications: - Familiarity with Microsoft 365 Copilot, Copilot connectors, and multi-channel AI deployments. - Understanding of conversational UX, agent lifecycle management, and middleware integration. - Relevant Microsoft certifications such as PL-500, AI-102, or Power Platform Solution Architect. - Bachelor's degree in computer science, Engineering, or a related technical discipline. 
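The human-in-the-loop (HITL) workflows named in the responsibilities above can be sketched as a simple routing gate in Python. The risk list, threshold, and action names are illustrative assumptions, not Skillsoft's actual policy:

```python
def route_action(action, confidence, approve):
    """Gate an agent's proposed action: auto-run only high-confidence,
    low-risk actions; everything else goes to a human reviewer."""
    RISKY = {"delete_record", "send_external_email"}  # hypothetical risk list
    if action in RISKY or confidence < 0.8:
        # approve() stands in for a real reviewer queue (Teams card, ticket, etc.)
        return "executed" if approve(action) else "rejected"
    return "executed"

auto = route_action("summarize_ticket", 0.95, approve=lambda a: False)
gated = route_action("delete_record", 0.99, approve=lambda a: False)
```

The key design point is that risky actions are escalated regardless of model confidence, which keeps the human review path exercised where compliance demands it.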
Join us at Skillsoft, where we provide online learning, training, and talent solutions to empower organizations and individuals. With a focus on immersive and engaging content, we help organizations unleash the potential of their workforce and equip teams with the essential skills for success. As a partner to numerous global organizations, including Fortune 500 companies, Skillsoft offers award-winning systems that support learning, performance, and success. Explore more about us at www.skillsoft.com. Thank you for considering this opportunity with us. If you are intrigued by this role and our mission, we encourage you to submit your application. Please note that we do not accept unsolicited resumes from employment agencies. All submissions must adhere to our guidelines as outlined in our policy.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

The Data Technical Consultant is an expert in technical and functional aspects of customer and partner engagements, leading to successful data management project delivery. As a Data Architect, you play a critical role in setting customers up for success by shaping and executing projects in the Salesforce data space. Your expertise in data management solutions ensures project success. With over 6 years of experience in complex data projects, including migrations, integrations, data architecture, and governance, you have the ability to translate high-level requirements into solutions without detailed specifications. Your proficiency with ETL tools such as SSIS, Boomi, or Informatica Power Centre, along with skills in Data Quality, Data Security, and Data Governance, sets you apart as a technology leader. You are adept at PL/SQL query writing, have a strong relational database background, and possess basic understanding of DBamp. Experience with tools like SF Data Loader, SOQL, and BI tools such as Tableau and Qlik is valuable. Knowledge in master data management, Salesforce.com, enterprise analytics, big data platforms, and CRM is preferred. Responsibilities include eliciting data requirements, data modeling, providing data governance leadership, conducting data audits and analysis, developing ETL processes, and driving technology architecture solutions. You will lead projects, mentor team members, and collaborate with cross-functional teams to deliver features. At the team level, you support career development, mentor team members, guide new hires, and motivate the team. You collaborate with clients, resolve technical dependencies, and drive common vision and capabilities across teams. Your role involves delivering Salesforce.com projects, participating in guilds, and ensuring alignment with company objectives. 
Other requirements include maintaining positive client relationships, adapting to challenging environments and timelines, willingness to learn new technologies, and strong communication and presentation skills. The ability to manage multiple projects, assess technical impacts, and provide constructive feedback is essential. Certification requirements include Salesforce Certified Administrator (201) and Salesforce Certified Sales Cloud Consultant or Service Cloud Consultant certifications.
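A recurring task in the data migrations this role describes is deduplicating source records before load. Here is a hedged sketch in Python; the field names mirror common Salesforce conventions but the matching rule (normalized email, latest-modified wins) is an illustrative choice:

```python
def dedupe_contacts(rows):
    """Collapse duplicate migration records on normalized email, keeping the
    most recently modified row; keyless rows are set aside for manual review."""
    best, review = {}, []
    for row in rows:
        key = (row.get("Email") or "").strip().lower()
        if not key:
            review.append(row)
            continue
        if key not in best or row["LastModifiedDate"] > best[key]["LastModifiedDate"]:
            best[key] = row
    return list(best.values()), review

rows = [
    {"Email": "jo@acme.com", "LastModifiedDate": "2024-01-01"},
    {"Email": " JO@ACME.COM", "LastModifiedDate": "2024-02-01"},
    {"Email": "", "LastModifiedDate": "2024-03-01"},
]
kept, needs_review = dedupe_contacts(rows)
```

Real projects usually add fuzzier survivorship rules (name similarity, source-system priority), but the shape of the pass stays the same.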

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

Mentoring and advising multiple technical teams while advancing financial technologies is a significant challenge with a profound impact, and this role puts you at the centre of it. As a Senior Manager of Software Engineering at JPMorgan Chase in the Consumer and Community Banking-Data Technology division, you hold a crucial leadership position. Your responsibilities include providing technical guidance and coaching to multiple teams while anticipating the needs and dependencies of various functions within the firm. Your expertise influences budget decisions and technical considerations to enhance operational efficiencies and functionalities. You will: - Provide overall direction, oversight, and coaching for a team of entry-level to mid-level software engineers handling basic to moderately complex tasks. - Take accountability for decisions affecting the team's resources, budget, tactical operations, and the execution of processes. - Ensure successful collaboration among teams and stakeholders, identifying and addressing issues while escalating them when necessary. - Offer input to leadership on budget, approach, and technical considerations to enhance operational efficiencies and team functionality. - Foster a culture of diversity, equity, inclusion, and respect within the team, emphasizing diverse representation. - Enable the Gen AI platform and implement Gen AI Use cases, LLM fine-tuning, and multi-agent orchestration. - Manage an AIML Engineering scrum team consisting of ML engineers, Senior ML engineers, and lead ML engineers. Required qualifications, capabilities, and skills: - Formal training or certification in software engineering concepts and a minimum of 5 years of applied experience. - Extensive practical experience with Python and AWS cloud services, including EKS, EMR, ECS, and DynamoDB. - Proficiency in DataBricks ML lifecycle development. - Advanced knowledge in software engineering, AI/ML, MLOps, and data governance.
- Demonstrated experience leading complex projects, system design, testing, and ensuring operational stability. - Expertise in computer science, computer engineering, mathematics, or related technical fields. - Understanding of large language model (LLM) approaches like Retrieval-Augmented Generation (RAG). Preferred qualifications, capabilities, and skills: - Real-time model serving experience with Seldon, Ray, or AWS SM. - Experience in agent-based model development. This role offers an exciting opportunity to lead and influence cutting-edge technology projects within a dynamic and innovative environment.
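The Retrieval-Augmented Generation (RAG) approach named in the qualifications pairs an LLM with a retrieval step over a document store. Here is a toy sketch of just that retrieval step in Python, using bag-of-words cosine similarity; production systems use learned embeddings and a vector store instead:

```python
import math
from collections import Counter

def retrieve(query, docs, k=2):
    """Rank documents by cosine similarity of bag-of-words vectors,
    returning the top k as context for generation."""
    def vec(text):
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0) for t in a)
        norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
        na, nb = norm(a), norm(b)
        return dot / (na * nb) if na and nb else 0.0

    q = vec(query)
    return sorted(docs, key=lambda d: -cosine(q, vec(d)))[:k]

docs = ["spark handles big data", "llm fine tuning guide", "data governance policy"]
top = retrieve("fine tuning an llm", docs, k=1)
```

The retrieved passages are then prepended to the prompt so the model answers from grounded context rather than parametric memory alone.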

Posted 2 weeks ago

Apply