1411 Data Governance Jobs - Page 25

JobPe aggregates results for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

20 - 30 Lacs

Ahmedabad

Hybrid

Naukri logo

Compare and match data between systems; investigate and fix mismatches. Help build dashboards, support audits, and maintain clear documentation. Manage new data entries, ensure accuracy, and oversee smooth data processes. Required candidate profile: 3 to 5 years' experience, including experience with data governance and DG-related systems; confidence in using applications, with some systems exposure (SAP, HFM, Oracle, Snowflake); and the autonomy to review and research.
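The reconciliation work this posting describes (comparing records between systems and flagging mismatches for investigation) can be sketched in plain Python; the system names, field names, and records below are hypothetical, not taken from the role.

```python
# Minimal two-system reconciliation sketch: records are keyed by ID,
# and any field-level differences or missing records are reported.
def reconcile(system_a, system_b, key="id"):
    """Compare records from two systems; return mismatches by key."""
    a_by_key = {rec[key]: rec for rec in system_a}
    b_by_key = {rec[key]: rec for rec in system_b}
    mismatches = {}
    for k in a_by_key.keys() | b_by_key.keys():
        ra, rb = a_by_key.get(k), b_by_key.get(k)
        if ra is None or rb is None:
            mismatches[k] = "missing in one system"
        else:
            diff = {f for f in ra if ra[f] != rb.get(f)}
            if diff:
                mismatches[k] = sorted(diff)
    return mismatches

# Hypothetical extracts from two of the systems the ad names.
sap = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
hfm = [{"id": 1, "amount": 100}, {"id": 2, "amount": 260}, {"id": 3, "amount": 40}]
print(reconcile(sap, hfm))  # id 2 differs on amount; id 3 is missing from SAP
```

Each flagged key becomes a candidate for the "investigate and fix" step the posting mentions.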

Posted 2 weeks ago

Apply

3.0 - 7.0 years

8 - 14 Lacs

Ahmedabad

Work from Office

3–5 yrs exp in data reconciliations (Catalyst/Keystone to GFIN, GFIN vs HFM), dashboard support (Power BI), audit support, and data governance (MDG). Proactive, Excel-savvy, system-fluent (SAP, HFM, Oracle), with strong analytical and comms skills.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Job Title: SAP Data Steward (Contract)
Location: Hyderabad, India

The SAP Data Steward is responsible for creating, maintaining, and deactivating master data and data attributes in SAP, with a focus on data migration. The Data Steward plays an essential role in monitoring existing and new data and in ensuring the timely, high-quality creation of new data in the system.

Requirements: Bachelor's degree, or Associate's degree with an additional 9+ years of work experience, or an equivalent combination of education and experience. Requires SAP functional knowledge of SAP Routings from a migration perspective, including complete knowledge of the SAP Routings tables and a basic understanding of the links and joins between tables needed to prepare the extract template in the client's format. Excellent attention to detail and an exceptional interest in creating order and consistency required. 10+ years of experience in data management and/or data governance activities and responsibilities. Experience working with SAP ECC required. Demonstrated expert-level capability with MS Excel required. High degree of initiative and ownership, with a proven history of delivering results while working with several different departments in a fast-paced environment. Experience creating and running business reports and data queries is preferred. Confident user of Microsoft Office (Word, Excel, Outlook, PowerPoint, Teams). Experience working with teams across multiple functions. Ability to multitask and work under tight timelines. Excellent verbal and written communication skills.
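The "linking between the tables and joins" the posting asks about can be illustrated with a small SQLite sketch. The three tables below are simplified, hypothetical stand-ins for SAP's routing structure (a material-to-routing assignment, a routing header, and routing operations), not the real SAP schemas.

```python
import sqlite3

# Sketch: join assignment -> header -> operations to produce a flat
# extract in a client-specified column order. Table and column names
# are illustrative stand-ins, not actual SAP table definitions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mapl (material TEXT, group_key TEXT);          -- assignment
CREATE TABLE plko (group_key TEXT, plant TEXT);             -- routing header
CREATE TABLE plpo (group_key TEXT, op_no TEXT, work TEXT);  -- operations
INSERT INTO mapl VALUES ('MAT-1', 'G1');
INSERT INTO plko VALUES ('G1', '1000');
INSERT INTO plpo VALUES ('G1', '0010', 'CUT'), ('G1', '0020', 'WELD');
""")
rows = conn.execute("""
    SELECT m.material, k.plant, p.op_no, p.work
    FROM mapl m
    JOIN plko k ON k.group_key = m.group_key
    JOIN plpo p ON p.group_key = m.group_key
    ORDER BY p.op_no
""").fetchall()
print(rows)  # flat rows ready to drop into the client's extract template
```

The same shape of join, scaled up, is what produces a migration extract template from routing data.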

Posted 2 weeks ago

Apply

8.0 - 13.0 years

11 - 21 Lacs

Bengaluru

Work from Office

Key Responsibilities:
1. Solution Design & Implementation: Configure and implement SAP HCM modules, including Personnel Administration (PA), Organizational Management (OM), Time Management, Payroll, and Employee Self-Service (ESS)/Manager Self-Service (MSS). Develop customized solutions for complex HR and payroll requirements, including statutory compliance and reporting. Integrate SAP HCM with third-party systems for payroll, benefits, and time tracking.
2. Support & Optimization: Provide end-to-end support for SAP HCM modules, addressing user queries, system issues, and enhancements. Optimize existing configurations and processes to improve system performance and user experience.
3. Cross-Module Integration: Ensure seamless integration of HCM modules with other SAP modules such as FI/CO, SuccessFactors, and SAP Fiori. Collaborate with technical teams to implement interfaces, reports, and workflows.
4. Emerging Technology Adoption: Support and configure SAP SuccessFactors Employee Central and its integration with SAP HCM. Leverage SAP Fiori apps to enhance the user experience for HR and payroll processes.
5. Stakeholder Collaboration: Collaborate with HR business teams to gather requirements, translate them into technical specifications, and deliver effective solutions. Act as a bridge between the technical and functional teams, ensuring smooth project execution.
6. Data Governance & Reporting: Ensure accurate and secure management of employee data in SAP systems. Develop and maintain reports using tools like SAP Query, Ad Hoc Reporting, or ABAP Reports.

Core Must-Have Skills:
• Expertise in SAP HCM modules, including Personnel Administration (PA), Organizational Management (OM), Time Management, Payroll (local and global compliance), and Employee Self-Service (ESS)/Manager Self-Service (MSS).
• Strong configuration and customization experience for statutory payroll and time evaluation.
• Knowledge of integration with SAP FI/CO for payroll posting and reconciliations.
• Experience implementing and supporting SAP SuccessFactors Employee Central and Recruiting/Onboarding modules.
• Hands-on experience with HR Renewal functionalities and SAP Fiori for HR processes.

Good-to-Have Skills:
• Familiarity with SAP BTP for extending HR functionalities.
• Understanding of the Talent Management Suite (Learning, Performance, Succession Planning).
• Experience implementing global payroll solutions for multi-geography operations.
• Proficiency in developing custom HR reports using ABAP HR or SAP Analytics Cloud (SAC).

Market Standard Expectations:
1. Certifications: SAP HCM or SAP SuccessFactors certifications; payroll certification specific to regional compliance (e.g., Nordic).
2. Project Experience: Exposure to end-to-end SAP HCM implementation and upgrade projects; hands-on experience with SAP ECC to S/4HANA migration projects.
3. Emerging Technologies: Knowledge of AI/ML-driven HR solutions integrated with SAP systems; experience leveraging robotic process automation (RPA) for HR workflows.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Hybrid

Job Title: Atlan Data Governance Implementation Engineer (Hands-On Role)

Key Responsibilities: Atlan Deployment & Connect

Required Skills and Qualifications: Minimum 8+ years of relevant experience in data governance, data analytics, or related fields. Strong understanding of data governance principles and best practices. Experience with data profiling, validation, and cleansing tools and techniques. Ability to analyze data, identify patterns, and recommend solutions. Excellent communication and interpersonal skills. Ability to work independently and collaboratively in a team environment. Hands-on experience with relational databases and data warehouses. Knowledge of data quality metrics, dashboards, and reporting. Experience with data governance tools and technologies. Strong analytical and problem-solving skills. Proficient in Python, REST API development, pandas, and NumPy. Experience with cloud platforms such as Azure, GCP, or AWS. Proficient in SQL and RDBMS.
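One of the listed skills, data profiling with quality metrics, can be sketched in a few lines of Python. The metric definitions, field name, and sample records below are illustrative assumptions, not part of the posting.

```python
# Sketch of a data-profiling pass: compute simple completeness and
# validity metrics over a hypothetical record set.
def profile(records, field, is_valid):
    """Return completeness and validity ratios for one field."""
    total = len(records)
    present = [r[field] for r in records if r.get(field) is not None]
    valid = [v for v in present if is_valid(v)]
    return {
        "completeness": len(present) / total,  # share of non-null values
        "validity": len(valid) / total,        # share passing the rule
    }

customers = [
    {"email": "a@example.com"},
    {"email": "not-an-email"},
    {"email": None},
    {"email": "b@example.com"},
]
metrics = profile(customers, "email", lambda v: "@" in v)
print(metrics)  # {'completeness': 0.75, 'validity': 0.5}
```

Metrics like these are what typically feed the data quality dashboards and reporting the role mentions.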

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad, Bengaluru

Hybrid

Job Description: We are seeking a highly skilled Azure Data Engineer with strong expertise in data architecture, PySpark/Python, Azure Databricks, and data streaming solutions. The ideal candidate will have hands-on experience in designing and implementing large-scale data pipelines, along with solid knowledge of data governance and data modeling.

Key Responsibilities: Design, develop, and optimize PySpark/Python-based data streaming jobs on Azure Databricks. Build scalable and efficient data pipelines for batch and real-time processing. Implement data governance policies, ensuring data quality, security, and compliance. Develop and maintain data models (dimensional, relational, NoSQL) to support analytics and reporting. Collaborate with cross-functional teams (data scientists, analysts, and business stakeholders) to deliver data solutions. Troubleshoot performance bottlenecks and optimize Spark jobs for efficiency. Ensure best practices in CI/CD, automation, and monitoring of data workflows. Mentor junior engineers and lead technical discussions (for senior/managerial roles).

Mandatory Skills & Experience: 5+ years of relevant experience as a Data Engineer/Analyst/Architect (8+ years for Manager/Lead positions). Expert-level proficiency in PySpark/Python and Azure Databricks (must have worked on real production projects). Strong experience in building and optimizing streaming data pipelines (Kafka, Event Hubs, Delta Lake, etc.). 4+ years of hands-on experience in data governance and data modeling (ER, star schema, data vault, etc.). In-depth knowledge of Azure Data Factory, Synapse, ADLS, and SQL/NoSQL databases. Experience with Delta Lake, Databricks Workflows, and performance tuning. Familiarity with data security, metadata management, and lineage tracking. Excellent communication skills (must be able to articulate technical concepts clearly).
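The streaming work described above would run as PySpark Structured Streaming jobs on Azure Databricks; as a rough pure-Python stand-in, this sketch shows the core idea of one common streaming operation, aggregating events into fixed tumbling windows. Event shapes and window size are assumptions.

```python
from collections import defaultdict

# Stand-in for a streaming aggregation: assign each event to a fixed
# (tumbling) time window and count occurrences per (window, key).
# A real pipeline would express the same thing with Spark's
# groupBy(window(...), ...) over a streaming DataFrame.
def tumbling_window_counts(events, window_s=60):
    """events: iterable of (epoch_seconds, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_s)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "view"), (90, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

The batch version of the same aggregation is what typically lands in the reporting data model.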

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 16 Lacs

Pune

Work from Office

We are looking for a skilled Azure Data Engineer to design, develop, and optimize data pipelines across the following stacks: 1) SQL + ETL + Azure + Python + PySpark + Databricks; 2) SQL + ADF + Azure; 3) SQL + Python + PySpark. Strong proficiency in SQL for data manipulation and querying required. Required candidate profile: Python and PySpark for data engineering tasks; experience with Databricks for big data processing and analytics; knowledge of data modeling, warehousing, and governance; CI/CD pipelines for data deployment. Perks and Benefits.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

What you will do: In this vital role you will be a Senior Manager, BI & Visualization, leading and driving enterprise-wide business intelligence (BI) and data visualization initiatives. This role is responsible for the strategic planning, governance, and execution of BI and analytics solutions, ensuring that business leaders have access to actionable insights through advanced reporting and visualization platforms. The ideal candidate will have a deep understanding of BI tools, data visualization best practices, self-service BI enablement, and enterprise analytics strategy, working closely with business executives, data teams, and technology leaders to foster a data-driven culture.

Responsibilities: Develop and execute a strategic BI & visualization roadmap, aligning with business goals, analytics objectives, and digital transformation strategies. Lead and mentor BI, analytics, and visualization teams, fostering a culture of innovation, collaboration, and continuous learning. Own the end-to-end BI lifecycle, including data modeling, dashboard development, analytics governance, and self-service BI adoption. Oversee the implementation of modern BI solutions, leveraging tools like Power BI, Tableau, Looker, Qlik Sense, or similar to deliver high-impact visual insights. Define and enforce data visualization best practices, ensuring dashboards are intuitive, user-friendly, and business-focused. Drive self-service BI enablement, empowering business users to explore, analyze, and act on data independently while maintaining data security and governance. Collaborate with business leaders, data scientists, and engineering teams to identify and prioritize high-value analytics use cases. Optimize BI infrastructure and reporting architecture, ensuring scalability, performance, and cost efficiency. Establish BI governance frameworks, defining data access controls, security policies, KPI standardization, and metadata management. Champion the use of AI/ML-powered BI solutions, enabling predictive analytics, anomaly detection, and natural-language-driven insights. Monitor BI performance metrics, ensuring reporting solutions meet business SLAs, operational efficiency, and data accuracy. Stay ahead of emerging trends in BI, data visualization, and analytics automation, ensuring the company remains competitive in its data strategy.

What we expect of you: Master's degree and 8 to 10 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree and 10 to 14 years of experience in Computer Science, IT, or a related field; OR Diploma and 14 to 18 years of experience in Computer Science, IT, or a related field. Certifications in Power BI or other visualization tools.

Basic Qualifications: 10-14+ years of experience in BI, analytics, and data visualization, with at least 5 years in a leadership role. Expertise in BI tools, including Power BI, Tableau, Looker, Qlik Sense, or similar enterprise BI platforms. Strong proficiency in data visualization principles, storytelling with data, and dashboard usability best practices. Experience leading large-scale BI transformation initiatives and driving self-service analytics adoption across an enterprise. Strong knowledge of data modeling, dimensional modeling (star/snowflake schema), and data warehousing concepts. Hands-on experience with SQL, DAX, Power Query (M), or other analytics scripting languages. Strong background in BI governance, data security, compliance, and metadata management. Ability to influence senior leadership, communicate insights effectively, and drive business impact through BI. Excellent problem-solving skills, with a track record of driving efficiency, automation, and data-driven decision-making.

Preferred Qualifications: Experience in the biotechnology or pharma industry is a big plus. Experience with Data Mesh, Data Fabric, or Federated Data Governance models. Experience with AI/ML-driven BI solutions, predictive analytics, and NLP-based BI capabilities. Knowledge of infrastructure and deployment automation for visualization platforms. Experience integrating BI with ERP, CRM, and operational systems (SAP, Salesforce, Oracle, Workday, etc.). Familiarity with Agile methodologies and the Scaled Agile Framework (SAFe) for BI project delivery.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly; organized and detail-oriented. Strong presentation and public speaking skills.

Posted 2 weeks ago

Apply

10.0 - 16.0 years

15 - 25 Lacs

Bengaluru

Work from Office

If interested, please apply via the link: https://forms.office.com/r/3Qjxw7hwYj

We are currently seeking a highly capable professional to join our team who can lead data governance programs and initiatives: designing governance frameworks, policies, and standards; implementing governance strategies; and ensuring data quality, security, compliance, and privacy.

As a successful Data Governance professional, you will be responsible for: Defining and driving data governance programmes and initiatives. Designing data governance and data quality frameworks. Driving and managing the implementation of data governance, data quality, and data catalog projects. Hiring and enabling the team on data governance. Using strong verbal and written communication to interface with varied senior stakeholders and establish a point of view. Understanding the regulatory landscape per client requirements and collaborating to maintain enterprise data policies and standards. Designing test solutions to ensure data quality principles are maintained as required. Designing and delivering effective, actionable insights on data privacy and risk, proactively identifying opportunities to reduce risk through timely action and mitigating risk related to data management. Developing, implementing, and promoting governance strategies, ensuring the quality, security, compliance, and privacy of data assets. Being the governance champion who leads data stewards, data owners, and data users, communicating and reinforcing the importance of data governance and supporting them to succeed in their roles. Collaborating with cross-functional teams to define governance policies, roles, and responsibilities. Managing and reporting on compliance with data privacy policies, escalating issues and concerns. Driving and managing the implementation of data governance tools and technologies to support data catalogs, business glossaries, technical and business metadata management, data lineage, active metadata management, and data quality monitoring.

Required Skills: A Master's degree in Business, Engineering, Sciences, IT, Computer Science, or Statistics. 12 to 16 years of information systems experience, including 10 years in data governance, data management, and delivering data governance projects/programs. CDMP certification is mandatory. Strong understanding of data governance frameworks, methodologies, and industry standards such as DAMA, GDPR, CCPA, DPDPA, BCBS 239, etc. Awareness of industry trends and priorities and the ability to apply them to governance and policies. Interest in the latest trends in AI and data, including data governance platforms. In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Strong experience with data management tools and technologies, such as data governance platforms, data cataloging tools, data quality and lineage tools, and security and privacy tools. Strong team spirit, balanced by a healthy sense of autonomy. Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels of the organization. Hands-on experience with any of the following data governance tools: Collibra, Atlan, Alation, Datahub, IDGC. Knowledge of relational databases and SQL is a must. Ability to take responsibility, manage complex projects, and be a trusted advisor to clients and consultants in the organization. Ability to work in a complex and constantly changing environment.
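The catalog and lineage capabilities this role implements with tools like Collibra or Atlan can be caricatured in a few lines of Python; the asset names, owners, and function shapes below are invented for illustration and do not reflect any real tool's API.

```python
# Minimal sketch of the metadata a governance catalog manages:
# glossary terms linked to physical assets, with owners and
# simple upstream lineage. All names are hypothetical.
catalog = {}

def register_asset(name, owner, glossary_term, upstream=()):
    catalog[name] = {"owner": owner, "term": glossary_term,
                     "upstream": list(upstream)}

def lineage(name):
    """Walk upstream dependencies of an asset, depth-first."""
    seen, stack = [], [name]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.append(node)
            stack.extend(catalog.get(node, {}).get("upstream", []))
    return seen

register_asset("raw.orders", owner="sales-ops", glossary_term="Order")
register_asset("dw.fact_orders", owner="data-eng", glossary_term="Order",
               upstream=["raw.orders"])
print(lineage("dw.fact_orders"))  # ['dw.fact_orders', 'raw.orders']
```

Real platforms add versioning, access policies, and active metadata on top, but the asset/owner/term/lineage core is the same.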

Posted 2 weeks ago

Apply

12.0 - 17.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Req ID: 326459

We are currently seeking a BFSI Data and Analytics Project Lead - CITI Bank to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: Data & Analytics Project Lead with over 12 years of experience in the BFSI domain. The Data and Analytics Project Lead will oversee the successful delivery of the Client's data and analytics projects, ensuring our clients derive maximum value from their data assets. This leadership role involves setting strategy, managing delivery teams, collaborating across functions, and upholding data governance and quality standards. The ideal candidate brings strong technical and business acumen to build and execute data-driven strategies aligned with the Client's mission of transforming their business with data-driven insights.

The core responsibilities for the job include the following:

Project and Program Oversight:
• Oversee end-to-end delivery of complex data and analytics projects, ensuring timely, high-quality, and cost-effective outcomes.
• Establish project governance, risk management, and quality assurance standards for effective project delivery.
• Monitor project portfolios and allocate resources to optimize productivity across multiple client engagements.

Stakeholder Collaboration and Engagement:
• Serve as the primary liaison between data delivery teams, sales, product, and client-facing teams to ensure client needs are met.
• Present data strategies, project status, and insights effectively to both technical and non-technical stakeholders, fostering alignment.
• Drive collaboration with product management and engineering teams to align on data needs and operational goals.

Innovation and Technology Adoption:
• Stay abreast of the latest trends in GenAI and Agentic AI in data engineering, data science, machine learning, and AI to enhance the Client's data capabilities.
• Drive the adoption of advanced analytics tools and technologies to improve data delivery efficiency and solution impact.
• Assess and recommend new tools, platforms, and partners to continuously improve data solutions.

Team Development and Leadership:
• Recruit, mentor, and retain a high-performing data and analytics team, fostering a culture of collaboration and continuous improvement.
• Set performance goals, conduct regular evaluations, and provide ongoing feedback to support team growth.

Minimum Skills Required:
• Educational Background: Bachelor's or Master's degree in Data Science, Computer Science, Business Administration, or a related field.
• Experience: 15+ years of experience in data and analytics, including at least 5 years in a leadership role with a proven track record in delivery management.
• Technical Proficiency: Deep understanding of data warehousing, data visualization, data governance, data trust, and big data tools (SQL, Python, R, Tableau, Power BI, and cloud platforms like AWS, Azure, or Google Cloud).
• Must have experience in cloud modernization and DWH/data lake project execution.
• BFSI Knowledge: Mandatory to have worked on BFSI projects and delivered Data & Analytics projects to BFSI clients.
• Project Management Expertise: Strong background in Agile, Scrum, or other project management methodologies.
• Leadership and Communication: Excellent interpersonal and communication skills, with a demonstrated ability to lead, influence, and engage stakeholders at all levels.
• Analytical and Problem-Solving Skills: Strong analytical mindset with a track record of delivering actionable insights from complex data.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

14 - 19 Lacs

Chennai, Gurugram, Bengaluru

Work from Office

We are currently seeking a Salesforce Data Cloud Architect to join our team in Hyderabad, Telangana, India.

Salesforce Data Cloud Expertise: Extensive knowledge of Salesforce Data Cloud features, capabilities, and best practices. Data Modeling: Strong experience in designing and implementing data models. Data Integration: Experience with data integration tools and techniques. Data Quality: Understanding of data quality concepts and practices. Data Governance: Knowledge of data governance principles and practices. SQL: Proficiency in SQL for data querying and manipulation. Problem-Solving: Strong analytical and problem-solving skills. Communication: Excellent communication and collaboration skills.

Location: Bengaluru, Chennai, Gurugram, Hyderabad, Noida, Pune

Posted 2 weeks ago

Apply

18.0 - 23.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Req ID: 326457

We are currently seeking a BFSI Data and Analytics Delivery Manager to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: Data & Analytics Delivery Manager with over 18 years of experience in the BFSI domain. The Data and Analytics Delivery Manager will oversee the successful delivery of the Client's data and analytics projects, ensuring our clients derive maximum value from their data assets. This leadership role involves setting strategy, managing delivery teams, collaborating across functions, and upholding data governance and quality standards. The ideal candidate brings strong technical and business acumen to build and execute data-driven strategies aligned with the Client's mission of transforming their business with data-driven insights.

The core responsibilities for the job include the following:

Project and Program Oversight:
• Oversee end-to-end delivery of complex data and analytics projects, ensuring timely, high-quality, and cost-effective outcomes.
• Establish project governance, risk management, and quality assurance standards for effective project delivery.
• Monitor project portfolios and allocate resources to optimize productivity across multiple client engagements.

Stakeholder Collaboration and Engagement:
• Serve as the primary liaison between data delivery teams, sales, product, and client-facing teams to ensure client needs are met.
• Present data strategies, project status, and insights effectively to both technical and non-technical stakeholders, fostering alignment.
• Drive collaboration with product management and engineering teams to align on data needs and operational goals.

Innovation and Technology Adoption:
• Stay abreast of the latest trends in GenAI and Agentic AI in data engineering, data science, machine learning, and AI to enhance the Client's data capabilities.
• Drive the adoption of advanced analytics tools and technologies to improve data delivery efficiency and solution impact.
• Assess and recommend new tools, platforms, and partners to continuously improve data solutions.

Team Development and Leadership:
• Recruit, mentor, and retain a high-performing data and analytics team, fostering a culture of collaboration and continuous improvement.
• Set performance goals, conduct regular evaluations, and provide ongoing feedback to support team growth.

Minimum Skills Required:
• Educational Background: Bachelor's or Master's degree in Data Science, Computer Science, Business Administration, or a related field.
• Experience: 15+ years of experience in data and analytics, including at least 5 years in a leadership role with a proven track record in delivery management.
• Technical Proficiency: Deep understanding of data warehousing, data visualization, data governance, data trust, and big data tools (SQL, Python, R, Tableau, Power BI, and cloud platforms like AWS, Azure, or Google Cloud).
• Must have experience in cloud modernization and DWH/data lake project execution.
• BFSI Knowledge: Mandatory to have worked on BFSI projects and delivered Data & Analytics projects to BFSI clients.
• Project Management Expertise: Strong background in Agile, Scrum, or other project management methodologies.
• Leadership and Communication: Excellent interpersonal and communication skills, with a demonstrated ability to lead, influence, and engage stakeholders at all levels.
• Analytical and Problem-Solving Skills: Strong analytical mindset with a track record of delivering actionable insights from complex data.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Req ID: 324162

We are currently seeking a Python Engineer with AWS and Java to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: Morgan Stanley is seeking a highly skilled Senior Python Developer with over 5 years of experience to join our team in developing a state-of-the-art electronic communications surveillance system. This system will monitor all voice communications, chats, and email messages of employees across the firm, ensuring compliance and security. The ideal candidate will have a proven track record in writing high-performance, low-latency code capable of processing millions of messages daily, with expertise in Python, a solid understanding of data structures and design patterns, and familiarity with Java.

Responsibilities:
• Design, develop, and implement a robust surveillance system from the ground up to monitor electronic communications in real time.
• Write high-performance, low-latency Python code to handle large-scale message processing (millions of messages per day).
• Collaborate with cross-functional teams to define system architecture and ensure scalability, reliability, and maintainability.
• Optimize data processing pipelines using Apache Kafka for real-time message streaming.
• Leverage Amazon AWS for cloud-based infrastructure, ensuring secure and efficient deployment.
• Design and maintain database schemas in PostgreSQL for efficient data storage and retrieval.
• Integrate Collibra for data governance and metadata management.
• Utilize Airflow for workflow orchestration and scheduling.
• Implement CI/CD pipelines using Jenkins and manage containerized applications with Docker.
• Use Artifactory for artifact management and dependency tracking.
• Apply advanced knowledge of data structures and design patterns to create clean, modular, and reusable code.
• Contribute to code reviews, testing, and documentation to maintain high-quality standards.

Minimum Skills Required:
• Experience: 5+ years of professional software development experience, with a focus on Python.
• Technical Skills:
o Expertise in writing high-performance, low-latency Python code for large-scale systems.
o Strong understanding of data structures, algorithms, and design patterns.
o Familiarity with Java for cross-language integration and support.
o Hands-on experience with Apache Kafka for real-time data streaming.
o Proficiency in Amazon AWS services (e.g., EC2, S3, Lambda, RDS).
o Experience with PostgreSQL for relational database management.
o Knowledge of Collibra for data governance (preferred).
o Familiarity with Apache Airflow for workflow orchestration.
o Experience with Jenkins CI for continuous integration and deployment.
o Proficiency in Docker for containerization and Artifactory for artifact management.
• Soft Skills:
o Strong problem-solving skills and attention to detail.
o Ability to work independently and collaboratively in a fast-paced environment.
o Excellent communication skills to articulate technical concepts to non-technical stakeholders.
• Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Preferred Qualifications:
• Experience in financial services or compliance systems.
• Familiarity with surveillance or monitoring systems for voice, chat, or email communications.
• Knowledge of regulatory requirements in the financial industry.
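As a toy illustration of the message-processing requirement: the production system would consume from Kafka, but the core dedupe-and-route logic can be sketched with a plain iterable standing in for the stream. Message shapes and channel names are assumptions for illustration.

```python
# Simplified stand-in for a communications-surveillance intake stage:
# deduplicate messages by id (Kafka can redeliver) and route each one
# into a per-channel bucket for downstream analysis.
def route_messages(stream):
    seen, buckets = set(), {"voice": [], "chat": [], "email": []}
    for msg in stream:
        if msg["id"] in seen:
            continue            # drop duplicate deliveries
        seen.add(msg["id"])
        buckets[msg["channel"]].append(msg)
    return buckets

stream = [
    {"id": 1, "channel": "chat", "text": "hi"},
    {"id": 2, "channel": "email", "text": "report"},
    {"id": 1, "channel": "chat", "text": "hi"},   # duplicate delivery
]
out = route_messages(stream)
print(len(out["chat"]), len(out["email"]), len(out["voice"]))  # 1 1 0
```

At millions of messages per day, the same idempotent-consumer idea is applied per partition, with the seen-set replaced by a bounded or persistent store.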

Posted 2 weeks ago

Apply

10.0 - 15.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Req ID: 323226

We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

Key Responsibilities: Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift). Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods. Enable real-time analytics and machine learning through scalable and modular data frameworks. Define data governance models, metadata management, lineage tracking, and access controls. Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage. Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures. Enforce data quality and observability practices for mission-critical workloads.

Required Skills: 10+ years in data architecture, with strong grounding in modern data platforms and pipelines. Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, and ETL/ELT frameworks. Hands-on experience with cloud data platforms (AWS, Azure, GCP). Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.). Experience implementing data mesh/data fabric concepts is a plus. Expertise in writing and presenting technical solutions using tools such as Word, PowerPoint, Excel, and Visio. High level of executive presence and the ability to articulate solutions to CXO-level executives.

Preferred Qualifications: Certifications in Snowflake, Databricks, or cloud-native data platforms. Exposure to AI/ML data pipelines, MLOps, and real-time data applications. Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).

Posted 2 weeks ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office


Req ID: 301930 We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Pune, Maharashtra (IN-MH), India (IN). Position Overview We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs. Key Responsibilities - Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS - Design and implement data streaming pipelines using Kafka/Confluent Kafka - Develop data processing applications using Python - Ensure data security and compliance throughout the architecture - Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions - Optimize data flows for performance, cost-efficiency, and scalability - Implement data governance and quality control measures - Provide technical leadership and mentorship to development teams - Stay current with emerging technologies and industry trends Required Skills and Qualifications - Bachelor's degree in Computer Science, Engineering, or related field - 7+ years of experience in data architecture and engineering - Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS - Proficiency in Kafka/Confluent Kafka and Python - Experience with Snyk for security scanning and vulnerability management - Solid understanding of data streaming architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
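A common shape for the Lambda/SNS/S3 combination named above is a handler that reacts to S3 "ObjectCreated" notifications and forwards a message. A hedged sketch follows; the event structure matches the S3 notification format, but the publish step is stubbed rather than calling boto3:

```python
import json

def publish_to_sns(message: str) -> None:
    # Stand-in for boto3's sns.publish(); stubbed so the sketch runs locally.
    print("would publish:", message)

def handler(event: dict, context=None) -> dict:
    """Lambda entry point: extract new object keys from an S3 event."""
    keys = [rec["s3"]["object"]["key"] for rec in event.get("Records", [])]
    for key in keys:
        publish_to_sns(json.dumps({"new_object": key}))
    return {"statusCode": 200, "processed": keys}

if __name__ == "__main__":
    # Minimal sample event in the S3 notification shape.
    sample = {"Records": [{"s3": {"object": {"key": "landing/orders.csv"}}}]}
    print(handler(sample))
```

In a deployed function the stub would be replaced by a real `boto3` SNS client call, and the topic ARN supplied via environment configuration.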

Posted 2 weeks ago

Apply

8.0 - 13.0 years

18 - 22 Lacs

Bengaluru

Work from Office


Req ID: 327606 We are currently seeking an Enterprise / Platform Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). ServiceNow Technical Architect Experience Level: 8+ years in ServiceNow development & solutioning Domain Expertise: ITSM, ITOM, ITAM, ServiceNow Architecture, Performance Optimization Role Overview: As a ServiceNow Technical Architect, you will be responsible for designing and implementing complex ServiceNow solutions with a strong focus on scalability, best practices, and architecture. You will lead technical design discussions, guide junior developers, and ensure adherence to ServiceNow development standards. Key Responsibilities: • Architect, develop, and implement end-to-end ServiceNow solutions across ITSM, ITOM, ITAM. • Lead complex scripting, automation, and custom application development. • Define and enforce coding standards, security best practices, and governance frameworks. • Implement advanced scripting using Glide APIs, JavaScript, AngularJS, and Jelly scripting. • Design and optimize Flow Designer, Orchestration, and business process automation. • Ensure optimal instance performance, security, and compliance. • Lead integrations using IntegrationHub, REST/SOAP APIs, Service Graph Connector. • Design enterprise-wide ServiceNow architecture, including data flows and integrations. • Provide technical leadership, mentoring junior developers, and conducting code reviews. • Define best practices for CI/CD pipelines, automated testing, and instance maintenance. Required Skills & Qualifications: • 8+ years of hands-on ServiceNow experience, with expertise in ITSM, ITOM, ITAM. • Strong understanding of ServiceNow architecture, scalability, and security best practices. • Deep expertise in JavaScript, Glide APIs, Flow Designer, and Orchestration. • Experience in ServiceNow integrations, including API security, OAuth, and third-party connectors.
• Experience with ServiceNow performance tuning, upgrade strategies, and DevOps pipelines. • Strong leadership, mentoring, and stakeholder communication skills. Preferred Qualifications: • ServiceNow CIS (Certified Implementation Specialist) in ITSM, ITOM, ITAM. • ServiceNow CAD (Certified Application Developer) certification is a plus. • Experience in ServiceNow CMDB health, data governance, and Service Mapping.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office


We are currently seeking a Lead Data Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). Position Overview We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs. Key Responsibilities - Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, EKS, Kafka and Confluent, all within a larger and overarching programme ecosystem - Architect data processing applications using Python, Kafka, Confluent Cloud and AWS - Ensure data security and compliance throughout the architecture - Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions - Optimize data flows for performance, cost-efficiency, and scalability - Implement data governance and quality control measures - Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams - Provide technical leadership and mentorship to development teams and lead engineers - Stay current with emerging technologies and industry trends Required Skills and Qualifications - Bachelor's degree in Computer Science, Engineering, or related field - 7+ years of experience in data architecture and engineering - Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS - Strong experience with Confluent - Strong experience in Kafka - Solid understanding of data streaming architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders - Knowledge of Apache Airflow for data orchestration
Preferred Qualifications - An understanding of cloud networking patterns and practices - Experience with working on a library or other long-term product - Knowledge of the Flink ecosystem - Experience with Terraform - Deep experience with CI/CD pipelines - Strong understanding of the JVM language family - Understanding of GDPR and the correct handling of PII - Expertise with technical interface design - Use of Docker Responsibilities - Design and implement scalable data architectures using AWS services, Confluent and Kafka - Develop data ingestion, processing, and storage solutions using Python and AWS Lambda, Confluent and Kafka - Ensure data security and implement best practices using tools like Snyk - Optimize data pipelines for performance and cost-efficiency - Collaborate with data scientists and analysts to enable efficient data access and analysis - Implement data governance policies and procedures - Provide technical guidance and mentorship to junior team members - Evaluate and recommend new technologies to improve data architecture

Posted 2 weeks ago

Apply

8.0 - 10.0 years

17 - 22 Lacs

Hyderabad

Work from Office


Role Purpose The purpose of the role is to define and develop the Enterprise Data Structure along with Data Warehouse, Master Data, Integration and transaction processing, while maintaining and strengthening modelling standards and business information. Do 1. Define and develop Data Architecture that aids the organization and clients in new/existing deals a. Partnering with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view to quickly visualize what data matters most to the organization, based on the defined business strategy c. Create data strategy and road maps for the Reference Data Architecture as required by the clients d. Engage all stakeholders to implement data governance models and ensure that the implementation is done for every change request e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise f. Develop, communicate, support and monitor compliance with Data Modelling standards g. Oversee and monitor all frameworks to manage data across the organization h. Provide insights for database storage and platforms for ease of use and least manual work i. Collaborate with vendors to ensure integrity, objectives and system configuration j. Collaborate with functional & technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization k. Present the data repository, objects and source systems along with data scenarios for front-end and back-end usage l.
Define high-level data migration plans to transition data from the source to the target system/application, addressing the gaps between the current and future state, typically in sync with the IT budgeting or other capital planning processes m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view n. Oversee all data standards/reference papers for proper governance o. Promote, guard and guide the organization towards common semantics and the proper use of metadata p. Collect, aggregate, match, consolidate, quality-assure, persist and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy and control q. Provide solutions for RFPs received from clients and ensure overall implementation assurance i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives ii. Analyse the technology environment, enterprise specifics and client requirements to set a collaboration solution for big/small data iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology iv. Define and understand current issues and problems and identify improvements v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout vi. Understand the root-cause problems in integrating business and product units vii. Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements ix. Track industry and application trends and relate these to planning current and future IT needs 2. Building an enterprise technology environment for data architecture management a.
Develop, maintain and implement standard patterns for data layers, data stores, data hub & lake and data management processes b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness c. Collect structured and unstructured data from different sources and integrate it into one database d. Work through every stage of data processing: analysing, creating physical data model designs, solutions and reports e. Build the enterprise conceptual and logical data models for analytics, operational and data mart structures in accordance with industry best practices f. Implement the best security practices across all databases based on accessibility and technology g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management and Data Governance (DG) h. Demonstrate strong experience in conceptual, logical and physical database architectures, design patterns, best practices and programming techniques around relational data modelling and data integration 3. Enable Delivery Teams by providing optimal delivery solutions/frameworks a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor b. Define database physical structure, functional capabilities, security, back-up and recovery specifications c. Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results d. Monitor system capabilities and performance by performing tests and configurations e. Integrate new solutions and troubleshoot previously occurring errors f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards g. Identify technical, process and structural risks and prepare a risk mitigation plan for all projects h.
Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams i. Recommend tools for reuse and automation for improved productivity and reduced cycle times j. Help the support and integration teams achieve better efficiency and ease of use for clients by using AI methods k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams l. Ensure architecture principles and standards are consistently applied to all projects m. Ensure optimal Client Engagement i. Support the pre-sales team while presenting the entire solution design and its principles to the client ii. Negotiate, manage and coordinate with the client teams to ensure all requirements are met iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor Mandatory Skills: Prophecy.AI.
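The consolidating and quality-assuring of master data described above usually ends in a "golden record" survivorship step: for each field, keep the value from the most trusted source. A hypothetical sketch; the source ranking and record shape are illustrative assumptions, not a specific MDM product's rules:

```python
# Lower rank = more trusted source (an assumed ordering for illustration).
SOURCE_RANK = {"SAP": 0, "CRM": 1, "legacy": 2}

def golden_record(records: list) -> dict:
    """Merge records field-by-field, preferring the most trusted source."""
    ordered = sorted(records, key=lambda r: SOURCE_RANK[r["source"]])
    merged = {}
    for rec in ordered:
        for field, value in rec.items():
            # First non-empty value from the highest-ranked source wins.
            if field != "source" and field not in merged and value:
                merged[field] = value
    return merged

if __name__ == "__main__":
    recs = [
        {"source": "legacy", "name": "ACME Ltd", "phone": "123"},
        {"source": "SAP", "name": "ACME Limited", "phone": ""},
    ]
    print(golden_record(recs))
```

Real survivorship engines add per-attribute rules (recency, completeness, manual overrides), but the trust-ordered merge above is the core idea.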

Posted 2 weeks ago

Apply

7.0 - 11.0 years

14 - 36 Lacs

Pune

Work from Office


Location: Open | Experience: 7+ years in AI/FinTech Apply: recruitment@fortitudecareer.com We're looking for a visionary AI Product Manager to turn big ideas into powerful, responsible AI products. Banking domain experience is essential. Work from home. Flexi working.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

30 - 35 Lacs

Mumbai

Work from Office


Lead and manage the Sales Pro System, ensuring adoption, performance analytics, and system enhancements to support insurance sales productivity and compliance; collaborate with teams to enhance system capabilities and ensure data accuracy and regulatory compliance.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Pune

Hybrid


About This Role: We are looking for a talented and experienced Data Engineering Tech Lead with hands-on expertise in an ETL tool and full knowledge of CI/CD practices, who has technically led a team of more than 5 in a client-facing setting and can create Data Engineering and Data Quality frameworks. As a tech lead you must build ETL jobs, Data Quality jobs and Big Data jobs, perform performance optimization based on the requirements, create re-usable assets and be able to perform production deployments; experience with DWH appliances (Snowflake/Redshift/Synapse) is preferred. Responsibilities Work with a team of engineers in designing, developing, and maintaining scalable and efficient data solutions using any Data Integration tool (e.g., Talend/Informatica) and Big Data technologies. Design, develop, and maintain end-to-end data pipelines using any ETL Data Integration tool (e.g., Talend/Informatica) to ingest, process, and transform large volumes of data from heterogeneous sources. Have good experience in designing cloud pipelines using Azure Data Factory or AWS Glue/Lambda. Implement Data Integration end to end with any ETL technologies. Implement database solutions for storing, processing, and querying large volumes of structured, unstructured and semi-structured data. Implement job migrations of ETL jobs from older versions to new versions. Implement and write advanced SQL scripts in SQL databases at a medium to expert level. Work with the client's technical team and provide guidance during technical challenges. Integrate and optimize data flows between various databases, data warehouses, and Big Data platforms. Collaborate with cross-functional teams to gather data requirements and translate them into scalable and efficient data solutions. Optimize ETL and data-load performance, scalability, and cost-effectiveness through optimization techniques.
Interact with the client on a daily basis, report technical progress and respond to technical questions. Implement best practices for data integration. Implement complex ETL data pipelines or similar frameworks to process and analyze massive datasets. Ensure data quality, reliability, and security across all stages of the data pipeline. Troubleshoot and debug data-related issues in production systems and provide timely resolution. Stay current with emerging technologies and industry trends in data engineering and CI/CD, and incorporate them into our data architecture and processes. Optimize data processing workflows and infrastructure for performance, scalability, and cost-effectiveness. Provide technical guidance and foster a culture of continuous learning and improvement. Implement and automate CI/CD pipelines for data engineering workflows, including testing, deployment, and monitoring. Perform migration to production deployment from lower environments; test and validate. Must Have Skills Must be certified in an ETL tool, database or cloud platform (Snowflake certification preferred). Must have implemented at least 3 end-to-end projects in Data Engineering. Must have worked on performance optimization and tuning for data loads, data processes and data transformations in Big Data. Must be flexible to write code in Java/Scala/Python etc. as required. Must have implemented CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline. Must have technically managed and guided a team of at least 5 members. Must have technical ownership capability of Data Engineering delivery. Strong client-facing communication capabilities. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in software engineering or a related role, with a strong focus on ETL tools, databases and integration.
Proficiency in ETL tools like Talend, Informatica etc. for Data Integration, for building and orchestrating data pipelines. Hands-on experience with relational databases such as MySQL, PostgreSQL, or Oracle, and NoSQL databases such as MongoDB, Cassandra, or Redis. Solid understanding of database design principles, data modeling, and SQL query optimization. Experience with data warehousing, Data Lake and Delta Lake concepts and technologies, data modeling, and relational databases.
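The Data Quality jobs this role must build typically run rule checks right after a load. A minimal sketch, using sqlite3 as a stand-in for the warehouses named above (Snowflake/Redshift/Synapse); the table and the two rules are illustrative assumptions:

```python
import sqlite3

def load_and_check(rows):
    """Load rows into an in-memory table, then run data-quality rules."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    failures = []
    # DQ rule 1: amounts must not be NULL.
    if conn.execute("SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]:
        failures.append("null_amount")
    # DQ rule 2: amounts must not be negative.
    if conn.execute("SELECT COUNT(*) FROM orders WHERE amount < 0").fetchone()[0]:
        failures.append("negative_amount")
    count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    conn.close()
    return count, failures

if __name__ == "__main__":
    print(load_and_check([(1, 10.0), (2, -5.0), (3, None)]))
```

In a framework, each rule would be a configurable SQL predicate and a failing check would gate the promotion of the load to the next environment.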

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA Senior GenAI Data Engineer We are seeking an experienced Senior Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams. What you'll be doing Key Responsibilities: Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, transform, and integrate it into a unified data platform for GenAI model training and deployment. GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance. Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance. Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness. Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications. Client Collaboration: Collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services. 
Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services. Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team. Requirements: Bachelor's degree in Computer Science, Engineering, or related fields (Master's recommended) Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or Cloud Native platforms) Proficiency in programming languages like SQL, Python, and PySpark Strong data architecture, data modeling, and data governance skills Experience with Big Data Platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), Data Warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi) Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions) Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras) Nice to have: Experience with containerization and orchestration tools like Docker and Kubernetes Integrate vector databases and implement similarity search techniques, with a focus on GraphRAG is a plus Familiarity with API gateway and service mesh architectures Experience with low latency/streaming, batch, and micro-batch processing Familiarity with Linux-based operating systems and REST APIs Location: Delhi or Bangalore Workplace type: Hybrid Working
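The similarity search that vector databases like those named above (Pinecone, Weaviate, Faiss, Annoy) perform at scale reduces to nearest-neighbour lookup under cosine similarity. A pure-Python sketch with toy embeddings (the vectors and ids are illustrative assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest(query, index):
    """Return the stored id whose vector is most similar to the query."""
    return max(index, key=lambda doc_id: cosine(query, index[doc_id]))

if __name__ == "__main__":
    # Toy 3-dimensional "embeddings"; real ones have hundreds of dimensions.
    index = {"doc_a": [1.0, 0.0, 0.2], "doc_b": [0.0, 1.0, 0.9]}
    print(nearest([0.1, 0.9, 0.8], index))
```

Production systems replace this linear scan with approximate indexes (HNSW, IVF) so lookups stay fast over millions of vectors, but the scoring function is the same.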

Posted 2 weeks ago

Apply

6.0 - 10.0 years

18 - 30 Lacs

Noida

Work from Office


Role Overview: We are seeking an experienced Data Protection and Privacy Manager (DPPM) who will be responsible for overseeing the organization's data protection strategy and ensuring compliance with Indian laws such as the Information Technology Act, 2000, and the Digital Personal Data Protection Act. The DPPM will work closely with the DPO and with regulatory authorities to ensure that sensitive data is processed securely and ethically. This role is crucial in safeguarding our digital assets and maintaining compliance with industry standards and the law. Key Responsibilities: Legal Compliance: Ensure adherence to data protection laws, including the local DPDPA and IT Act, and international regulations like the GDPR. Policy Development: Formulate and implement data protection policies and guidelines across the organization. Data Breach Management: Investigate and report data breaches to the relevant authorities within the stipulated timeframe. Training & Awareness: Conduct training sessions for employees on data protection practices and raise awareness about privacy policies. Impact Assessments: Perform Data Protection Impact Assessments (DPIA) to identify risks and recommend mitigation strategies. Record Maintenance: Maintain detailed records of data processing activities to meet legal obligations. Grievance Redressal: Act as the point of contact for data principals (individuals) for grievances related to data handling or privacy violations. Recordkeeping: Maintain records of all data processing activities and policies for audit and regulatory purposes. Liaison: Liaise with regulatory agencies and stakeholders regarding data protection matters. Qualifications and Experience: Bachelor's degree in Information Security, Computer Science, or Law. Certification in Data Protection or Privacy Management is mandatory, e.g., CDPP, CDPO, CIPM, CDPSE or DCPP. 8-10 years of experience in security management. Strong understanding of IT infrastructure, data security best practices, and frameworks.
Familiarity with regulatory requirements and compliance standards (e.g., RBI, SEBI). Excellent communication, interpersonal, analytical and leadership skills. Knowledge of emerging technologies and their impact on data protection, and the ability to handle sensitive information discreetly.
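The duty to "report data breaches within the stipulated timeframe" is usually operationalized as a deadline tracker in the breach log. A small sketch; the 72-hour window here is the GDPR Article 33 notification deadline, used as an illustrative default since local timelines may differ:

```python
from datetime import datetime, timedelta

# Assumed default window (GDPR Art. 33); make configurable per jurisdiction.
REPORT_WINDOW = timedelta(hours=72)

def report_deadline(detected_at: datetime) -> datetime:
    """When the regulator must be notified, counted from detection."""
    return detected_at + REPORT_WINDOW

def is_overdue(detected_at: datetime, now: datetime) -> bool:
    """True once the notification window has elapsed."""
    return now > report_deadline(detected_at)

if __name__ == "__main__":
    detected = datetime(2024, 1, 1, 9, 0)
    print(report_deadline(detected))
    print(is_overdue(detected, datetime(2024, 1, 5, 0, 0)))
```

A breach-management workflow would attach this deadline to each incident record and escalate as it approaches.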

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Chennai

Hybrid


Key Responsibilities: Technical Skills: Strong proficiency in SQL for data manipulation and querying. Knowledge of Python scripting for data processing and automation. Experience in Reltio Integration Hub (RIH) and handling API-based integrations. Familiarity with Data Modelling, Matching, and Survivorship concepts and methodologies. Experience with D&B, ZoomInfo, and Salesforce connectors for data enrichment. Understanding of MDM workflow configurations and role-based data governance. Experience with AWS Databricks, Data Lake and Warehouse. Implement and configure MDM solutions using Reltio while ensuring alignment with business requirements and best practices. Develop and maintain data models, workflows, and business rules within the MDM platform. Work on Reltio Workflow (DCR Workflow & Custom Workflow) to manage data approvals and role-based assignments. Support data integration efforts using Reltio Integration Hub (RIH) to facilitate data movement across multiple systems. Develop ETL pipelines using SQL, Python, and integration tools to extract, transform, and load (ETL) data. Work with D&B, ZoomInfo, and Salesforce connectors for data enrichment and integration. Perform data analysis and profiling to identify data quality issues and recommend solutions for data cleansing and enrichment. Collaborate with stakeholders to define and document data governance policies, procedures, and standards. Optimize MDM workflows to enhance data stewardship and governance.
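The Matching concept listed above typically groups candidate records on a normalized match key before survivorship runs. A hypothetical sketch; the normalization rules (lowercase, strip punctuation, drop legal suffixes) are illustrative assumptions, not Reltio's actual match configuration:

```python
import re

# Assumed legal suffixes to ignore when comparing company names.
SUFFIXES = {"inc", "ltd", "llc", "limited"}

def match_key(name: str) -> str:
    """Normalize a company name into a comparable match key."""
    tokens = re.sub(r"[^a-z0-9 ]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def group_matches(names):
    """Bucket raw names by match key; each bucket is a candidate merge."""
    groups = {}
    for n in names:
        groups.setdefault(match_key(n), []).append(n)
    return groups

if __name__ == "__main__":
    print(group_matches(["Acme Inc.", "ACME Ltd", "Globex LLC"]))
```

Real MDM match engines layer fuzzy comparators and confidence scores on top, but exact-match-on-cleansed-key is the usual first rule in the cascade.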

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Noida, Chennai

Hybrid


Deployment, configuration & maintenance of Databricks clusters & workspaces Security & Access Control Automate administrative tasks using tools like Python, PowerShell & Terraform Integrations with Azure Data Lake, Key Vault & implementation of CI/CD pipelines Required Candidate profile Azure, AWS, or GCP; Azure experience is preferred Strong skills in Python, PySpark, PowerShell & SQL Experience with Terraform, ETL processes, data pipelines & big data technologies Security & Compliance
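Admin automation of the kind described often means generating cluster specs with mandatory governance tags before they are submitted to the platform. A hedged sketch: the required tags, field values, and policy are illustrative assumptions, and the actual API submission is omitted.

```python
import json

# Assumed governance policy: every cluster must carry these tags.
REQUIRED_TAGS = {"cost_center", "owner"}

def build_cluster_spec(name, workers, tags):
    """Build a cluster spec dict, rejecting requests with missing tags."""
    missing = REQUIRED_TAGS - tags.keys()
    if missing:
        raise ValueError(f"missing required tags: {sorted(missing)}")
    return {
        "cluster_name": name,
        "num_workers": workers,
        "custom_tags": tags,
        # Assumed policy: never leave clusters running idle.
        "autotermination_minutes": 30,
    }

if __name__ == "__main__":
    spec = build_cluster_spec("etl-nightly", 4,
                              {"cost_center": "FIN-01", "owner": "data-eng"})
    print(json.dumps(spec, indent=2))
```

In practice the resulting dict would be posted to the workspace's cluster-management API (or rendered into Terraform), with the tag check acting as a pre-flight compliance gate.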

Posted 2 weeks ago

Apply