Home
Jobs

1361 Data Governance Jobs - Page 6

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

10.0 - 15.0 years

55 - 60 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Position Overview
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities

Data Architecture & Modeling
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
- Establish data modeling standards and best practices across the organization.

Technical Leadership
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
- Architect scalable data solutions that handle large volumes of healthcare transactional data.
- Collaborate with data engineers to optimize data pipelines and ensure data quality.

Healthcare Domain Expertise
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
- Design data models that support analytical, reporting, and AI/ML needs.
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions.

Data Governance & Quality
- Implement data governance frameworks specific to healthcare data privacy and security requirements.
- Establish data quality monitoring and validation processes for critical health plan metrics.
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources.

Required Qualifications

Technical Skills
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data.
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches (see the sketch after this listing).
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
- Proficiency with data modeling tools (Hackolade, ERwin, or similar).

Healthcare Industry Knowledge
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
- Experience with healthcare data standards and medical coding systems.
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).

Leadership & Communication
- Proven track record of leading data modeling projects in complex healthcare environments.
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements.
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders.
- Experience mentoring team members and establishing technical standards.

Preferred Qualifications
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
- Cloud platform certifications (AWS, Azure, or GCP).
- Experience with real-time data streaming and modern data lake architectures.
- Knowledge of machine learning applications in healthcare analytics.
- Previous experience in a lead or architect role within healthcare organizations.

Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
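By way of illustration, the dimensional models this role describes typically organize claims data as a fact table joined to member and provider dimensions. The sketch below is a minimal, hypothetical Python/pandas example of that star-schema shape (all table and column names are invented, not from the posting), computing a simple per-plan paid-amount metric:

```python
import pandas as pd

# Hypothetical dimension table: one row per enrolled member.
dim_member = pd.DataFrame({
    "member_id": [1, 2, 3],
    "plan_code": ["HMO-A", "HMO-A", "PPO-B"],
})

# Hypothetical fact table: one row per adjudicated claim line.
fact_claim = pd.DataFrame({
    "claim_id": [101, 102, 103, 104],
    "member_id": [1, 2, 2, 3],
    "icd10_code": ["E11.9", "I10", "E11.9", "J45.909"],
    "paid_amount": [250.0, 90.0, 410.0, 130.0],
})

# Star-schema join: facts enriched with dimension attributes,
# then aggregated to a per-plan cost metric for reporting.
paid_by_plan = (
    fact_claim.merge(dim_member, on="member_id", how="left")
              .groupby("plan_code")["paid_amount"]
              .sum()
              .rename("total_paid")
)
print(paid_by_plan)
```

Real HEDIS/MLR reporting adds many more dimensions (provider, time, benefit), but the fact-joins-dimensions pattern is the same.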

Posted 2 days ago

Apply

5.0 - 8.0 years

25 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


- Expertise in Informatica DGDQ tools for data quality and governance management
- Strong knowledge of SQL and data modeling techniques
- Experience with data profiling, cleansing, validation, and enrichment (see the sketch below)
- Knowledge of data governance best practices, including data stewardship and metadata management
- Experience working with large datasets and complex data environments
- Familiarity with data security, compliance, and regulatory requirements
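The profiling, cleansing, and validation work named above is tool-specific in Informatica, but the underlying checks are generic. A minimal, hypothetical sketch of such checks in Python/pandas (column names, sample data, and the email rule are invented for illustration):

```python
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", "bad-email", "b@x.com", None, "c@x.com"],
})

# Profiling: null rates and duplicate keys, the first things a DQ scan reports.
null_rates = df.isna().mean()
dup_keys = df["customer_id"].duplicated(keep=False) & df["customer_id"].notna()

# Validation: a simple rule, e.g. emails must match a basic pattern.
valid_email = df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)

print("Null rate per column:\n", null_rates)
print("Rows with duplicated customer_id:\n", df[dup_keys])
print("Rows failing the email rule:\n", df[~valid_email])
```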

Posted 2 days ago

Apply

1.0 - 6.0 years

7 - 16 Lacs

Mumbai

Hybrid


Provide initial technical assistance: troubleshooting, documenting, managing, maintaining, and supporting the MI estate. Hands-on experience with the Data Steward interface and basic data validation and analysis. Log and track issues; troubleshoot and resolve common Power BI issues. Required candidate profile: minimum 1 year of experience in MI with Power BI, Power Query, DAX, SQL, ETL, dashboards, and Agile data stewardship; data governance, data visualization, and databases (SQL, Azure); strong technical and troubleshooting knowledge.

Posted 2 days ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Mumbai

Hybrid


Key skills required:
- Strong SQL expertise
- Hands-on experience with Advent Geneva and dataset setup
- Excellent communication skills to work across teams and functions

Posted 2 days ago

Apply

8.0 - 12.0 years

25 - 35 Lacs

Mumbai

Work from Office


Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office


Primary responsibilities:
- Ensure there is an up-to-date and verified database of all Assets and Configuration Items (CIs) throughout the IT Service Management lifecycle.
- Apply a continuous improvement approach to identifying and tracking company assets, CIs, and their dependencies within the Configuration Management Database (CMDB).
- Collaborate with key stakeholders, internal teams (such as the Applications and Change and Release Management teams) and external vendors on the introduction and retirement of assets through changes, releases, upgrades, new technology deployments and acquisitions.
- Support the Head of Asset and Configuration in managing the Data Governance Framework, monitoring the quality of updates to the CMDB (the single source of truth), which is integrated with our Enterprise Architecture Platform solution (BizzDesign) and downstream processes.
- Establish and maintain relationships between infrastructure, applications, and business services and capabilities through the CMDB/CSDM (see the sketch below).
- Work closely with IT teams to identify and resolve issues related to configuration and asset management.
- Generate and analyse reports to identify trends and areas for improvement.
- Collaborate with other IT teams to implement and maintain best practices for configuration and asset management.
- Monitor and enforce compliance with configuration and asset management policies and procedures.
- Provide guidance and support to other IT teams on configuration and asset management-related issues.
- Support the process owner and process manager, providing inputs to the creation of principles, processes, and procedures.
- Support the definition of the structure of the configuration management system, including CI types, naming conventions, required and optional attributes, and relationships.
- Propose the scope for service asset and configuration management and perform configuration audits.

Requirements:
- Proven experience in asset and configuration management using the ServiceNow platform, with a focus on ServiceNow Discovery, Graph Connectors, and Multisource CMDB.
- ITAM, CAMP, CITAM or other accreditation in the Asset and Configuration Management discipline.
- Strong attention to detail, with the ability to process data and work in a complex global organisational environment.
- Extremely strong organisation and productivity skills; ability to interface with managers, staff, and stakeholders within the organisation.
- Proven experience with ITSM tools such as ServiceNow.
- Proven experience and excellent knowledge of cloud technologies and physical and virtual infrastructure.
- Excellent verbal and written presentation and communication skills.
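To make the CMDB/CSDM relationship idea concrete, here is a small, hypothetical Python sketch (the CI names and edge structure are invented, not ServiceNow's data model): each configuration item records typed "depends on" relationships, so impact analysis becomes a graph walk from infrastructure up to business services.

```python
from collections import defaultdict

# Hypothetical CI relationship store: parent CI -> CIs it depends on.
edges = {
    "Payments Service": ["payments-app"],          # business service -> application
    "payments-app": ["vm-prod-01", "db-prod-01"],  # application -> infrastructure
}

# Invert the edges so we can ask: if this CI fails, what is impacted upstream?
impacted_by = defaultdict(list)
for parent, children in edges.items():
    for child in children:
        impacted_by[child].append(parent)

def impact_of(ci: str) -> set[str]:
    """Walk upstream relationships to find every CI affected by an outage of `ci`."""
    seen: set[str] = set()
    stack = [ci]
    while stack:
        for parent in impacted_by[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

print(impact_of("db-prod-01"))  # {'payments-app', 'Payments Service'}
```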

Posted 3 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru, Karnataka

Work from Office


We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.
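As an illustration of the real-time ingestion pattern this role mentions, here is a minimal PySpark Structured Streaming sketch that reads JSON events from Kafka and lands them as Parquet. The broker, topic, schema, and storage paths are hypothetical placeholders, and running against Kafka also requires the matching spark-sql-kafka package on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Hypothetical event schema for the incoming JSON payloads.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                     # placeholder topic
       .load())

# Kafka values arrive as bytes; decode and parse into typed columns.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(from_json(col("json"), schema).alias("e"))
             .select("e.*"))

# Continuously append to cloud storage; the checkpoint makes the job restartable.
query = (parsed.writeStream.format("parquet")
         .option("path", "gs://example-bucket/events/")
         .option("checkpointLocation", "gs://example-bucket/checkpoints/events/")
         .start())
query.awaitTermination()
```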

Posted 3 days ago

Apply

7.0 - 11.0 years

9 - 11 Lacs

Mumbai, Indore, Hyderabad

Work from Office


We're Hiring: Data Governance Lead

Locations: Offices in Austin (USA), Singapore, Hyderabad, Indore, Ahmedabad (India)
Primary Job Location: Hyderabad / Indore / Ahmedabad (Onsite Role)
Compensation Range: Competitive | Based on experience and expertise
To Apply, Share Your Resume With: Current CTC, Expected CTC, Notice Period, Preferred Location

What You Will Do - Key Responsibilities

1. Governance Strategy & Stakeholder Enablement
- Define and drive enterprise-level data governance frameworks and policies.
- Align governance objectives with compliance, analytics, and business priorities.
- Work with IT, Legal, Compliance, and Business teams to drive adoption.
- Conduct training, workshops, and change management programs.

2. Microsoft Purview Implementation & Administration
- Administer Microsoft Purview: accounts, collections, RBAC, and scanning policies.
- Design a scalable governance architecture for large-scale data environments (>50 TB).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.

3. Metadata & Data Lineage Management
- Design metadata repositories and workflows.
- Ingest technical/business metadata via ADF, REST APIs, PowerShell, and Logic Apps (see the sketch below).
- Validate end-to-end lineage (ADF to Synapse to Power BI), impact analysis, and remediation.

4. Data Classification & Security
- Implement and govern sensitivity labels (PII, PCI, PHI) and classification policies.
- Integrate with Microsoft Information Protection (MIP), DLP, Insider Risk, and Compliance Manager.
- Enforce lifecycle policies, records management, and information barriers.

Also required: working knowledge of GDPR, HIPAA, SOX, and CCPA, plus strong communication and leadership to bridge technical and business governance.
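For the metadata-ingestion bullet above, custom scripts typically push entities into Purview's Atlas-compatible catalog API. The sketch below is a hedged illustration only: it assumes the Atlas-style /catalog/api/atlas/v2/entity endpoint, and the account name, token acquisition, and asset payload are all placeholders (a real job would use azure-identity and your tenant's details):

```python
import requests

# Placeholders: acquire the token via Azure AD and point at your own account.
PURVIEW = "https://contoso-purview.purview.azure.com"
TOKEN = "<bearer token from Azure AD>"

# Minimal Atlas-style entity payload registering a hypothetical Azure SQL table.
entity = {
    "entity": {
        "typeName": "azure_sql_table",
        "attributes": {
            "qualifiedName": "mssql://contoso.database.windows.net/sales/dbo/orders",
            "name": "orders",
            "description": "Orders table registered by a custom ingestion script.",
        },
    }
}

resp = requests.post(
    f"{PURVIEW}/catalog/api/atlas/v2/entity",
    json=entity,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```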

Posted 3 days ago

Apply

11.0 - 15.0 years

10 - 15 Lacs

Indore, Ahmedabad

Work from Office


Education Required: Bachelor's or Master's (PhD a plus) in Computer Science, Statistics, Mathematics, Data Science, or Engineering.
Must-have skills: visualization tools such as Power BI, Tableau, or Looker; ETL; Azure Data Factory; SQL stored procedures; performance tuning.
Good-to-have skills: statistical modeling, predictive analytics, or machine learning.

Expectation: We are seeking a detail-oriented, analytical, and experienced Data Engineer. The ideal candidate will be responsible for collecting, analyzing, and interpreting large datasets across various data sources to drive business decisions and strategy. This role requires strong technical skills, a passion for data, and the ability to communicate actionable insights through reporting.
- Collect, clean, and validate data from various sources to ensure accuracy and consistency.
- Analyze large datasets to identify trends, patterns, and actionable insights that support business objectives.
- Develop and maintain dashboards and reports using visualization tools such as Power BI, Tableau, or Looker.
- Collaborate with stakeholders to understand business needs and translate them into data-driven solutions.
- Present findings, insights, and recommendations to both technical and non-technical audiences in a clear and concise manner.
- Monitor key performance indicators (KPIs) and provide regular updates to business units.
- Support data governance and data quality initiatives to enhance overall data integrity.

Preferred Skills:
- Experience in statistical modeling, predictive analytics, or machine learning.
- Familiarity with cloud-based data platforms (e.g., Azure).
- Ability to work in a fast-paced environment and manage multiple priorities.

Posted 3 days ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Date: 20 Jun 2025
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Could you be the full-time Documentation Management Specialist in our dynamic Project/Program team we're looking for?

Your future role
- Implement Documentation Management within the Project/Program/Bid organization (in the Leading Unit, and coordinate in all Participating Units).
- Monitor complete Project/Program/Bid documentation and execute the documentation life cycle with the Documentation Master List (DML).
- Support the Project/Program/Bid Documentation Manager (PrDM) of the relevant Project/Program/Bid.
- Check internal validation (AVVA) of documents before submission to the Customer.
- Be accountable for the Project Documentation being properly archived.
- Play a proactive role in the Project/Program/Bid in terms of implementation of a documentation culture.

Key accountabilities
- Applies the rules defined in the Documentation Management Plan (DMP) and metadata.
- Controls documentation standards: template, reference, version.
- Defines and deploys the documentation numbering system.
- Defines and deploys the single repository for project working documents (in Teams, SharePoint, etc.).
- Controls contractual documentation submissions according to customer requirements.
- Assigns references, when needed, to the documentation produced and controls electronic file identification and customer identification (if relevant).
- Checks the identification used by entities and partners.
- Records in the EDMS all technical and management documentation from external entities.
- Makes sure internal validation of Project/Program documents (AVVA) is done before submission to the Customer.
- Completes/updates the DML (Documentation Master List) with data collected during the project and prepares the CDL.
- Prepares and records document submissions to Customer/Partners.
- Sends documentation according to the internal distribution list.
- Provides inputs related to documentation dashboards and reports.
- Follows Customer and stakeholder documentation status in the DML.
- Updates the document contractual templates according to contractual requirements.
- In applying internal rules/instructions, records document translation and physical archiving.
- Prepares and participates in sub-system Gate Reviews and coordinates with the PrDM.
- Suggests improvements to the activity (REX on documentation issues, process, organisation).

Performance measurements
- Documentation Management KPIs (quality, delivery of contractual documentation in due time).
- Overview of documentation milestones in general, billing milestones, penalties.
- Adherence to process, measured by process inspections.

Important to note
As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.

Posted 3 days ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Kolkata

Work from Office


Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts (see the sketch below).

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
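Since the posting names Python for orchestration scripts, here is a minimal, hypothetical sketch of the in-warehouse (ELT-style) transformation pattern such a script might run via the snowflake-connector-python package. Account, credentials, and table names are placeholders; a real setup would pull secrets from a vault and this shape of work would often live in a DBT model instead:

```python
import snowflake.connector

# Placeholder connection details; real jobs read these from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # In-warehouse transformation: rebuild a modeled table from raw staged
    # data, the same shape of work a DBT model performs.
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.MARTS.DIM_CUSTOMER AS
        SELECT customer_id,
               INITCAP(TRIM(customer_name)) AS customer_name,
               MAX(updated_at)              AS last_updated
        FROM   ANALYTICS.STAGING.RAW_CUSTOMERS
        GROUP  BY customer_id, INITCAP(TRIM(customer_name))
    """)
    print(cur.fetchone())  # status row, e.g. "Table DIM_CUSTOMER successfully created."
finally:
    conn.close()
```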

Posted 3 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Senior SAP SAC Consultant - Job Responsibilities

Solution Design & Architecture
- Lead the design and architecture of SAP Analytics Cloud (SAC) solutions aligned with business objectives.
- Translate complex business requirements into technical SAC models and dashboards.
- Define data architecture, models (live/acquired), and connectivity with SAP and non-SAP systems (e.g., BW/4HANA, S/4HANA, HANA, SQL, etc.).

Dashboard & Story Development
- Develop interactive and visually compelling SAC stories and dashboards using advanced scripting and calculation capabilities.
- Customize UI/UX using SAC features like widgets, charts, filters, and responsive pages.

Data Modeling & Integration
- Design and build data models within SAC and integrate external datasets as needed.
- Ensure high performance and accuracy through optimized data transformations and blending.
- Configure and manage data import/export jobs and schedules.

Advanced Analytics & Planning
- Utilize SAC's predictive capabilities, Smart Insights, and Smart Discovery to provide actionable insights.
- Implement and manage planning scenarios, input forms, allocation logic, and forecast models (if applicable).

Stakeholder Collaboration
- Act as the key point of contact between business users and IT, gathering requirements and providing best-practice solutions.
- Conduct workshops, training sessions, and end-user support activities.

Performance Optimization & Governance
- Optimize SAC reports and stories for performance and usability.
- Enforce data governance, security roles, and access controls within SAC.

Project Management & Leadership
- Lead the end-to-end project lifecycle for SAC implementations, upgrades, and enhancements.
- Mentor junior consultants and provide technical guidance to cross-functional teams.

Documentation & Compliance
- Prepare technical documentation, user guides, and test scripts.
- Ensure compliance with internal data security and external regulatory standards.

Innovation & Continuous Improvement
- Stay current with SAC updates, roadmap features, and SAP BTP innovations.
- Proactively suggest improvements to enhance analytics maturity and value delivery.

Posted 3 days ago

Apply

4.0 - 7.0 years

11 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Manager - Business Intelligence & Analytics

Analyze and interpret large sets of data to provide valuable insights and strategic recommendations to the organization. The role involves leveraging data to drive strategic decision-making, optimizing business processes, and enabling the organization to gain a competitive edge through data-driven insights.

1. Data Analysis: Prepare dashboards on the Microsoft BI tool. Collaborate with teams to collect, clean, and analyze data from various sources to identify trends, patterns, and correlations.
2. Reporting and Visualization: Create reports, dashboards, and visualizations to present data-driven insights in a clear and concise manner to management.
3. Business Intelligence Strategy: Develop and implement a comprehensive business intelligence strategy that aligns with the organization's goals and objectives.
4. Data-driven Decision Making: Assist senior management in making informed decisions by providing data-backed recommendations and insights.
5. Cross-functional Collaboration: Work closely with other departments such as marketing, finance, product development, and IT to understand their data needs and provide actionable insights to support their objectives.
6. Data Governance: Ensure data quality, integrity, and security by establishing and enforcing data governance policies and procedures.
7. Emerging Technologies: Stay updated with the latest trends and advancements in business intelligence tools, data analytics techniques, and data visualization platforms to enhance the team's capabilities.
8. Team Leadership: Roll out monthly campaigns for all countries to obtain promotions from hotels and activities to improve profitability. Negotiate overrides with various hotels, activities, excursions, and third-party suppliers to improve sales and margins.

Required Qualifications:
- Bachelor's degree in a relevant field such as Business Administration, Information Systems, or a related discipline.
- Strong analytical skills for gathering and interpreting information to identify trends.
- Proficiency in using data analysis tools.
- Ability to visualize data effectively using visualization tools like Power BI.
- Familiarity with business intelligence concepts, methodologies, and data governance.
- Travel industry knowledge.
- Excellent written and verbal communication skills.

Desired Qualifications:
- Experience in forecasting and predictive modelling.
- Data mining and machine learning techniques.
- Experience in designing and creating interactive reports, dashboards, and visualizations that effectively communicate data insights to various stakeholders.
- Knowledge of data privacy and security.

Job Type: Full Time
Job Location: India

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office


IICS Developer

Job Overview: We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience in the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role will involve working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes.

Key Responsibilities:
- Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services).
- Work with Snowflake data warehouse solutions, including data loading, transformation, and querying.
- Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake.
- Troubleshoot and resolve integration issues within the IICS platform and Snowflake.
- Ensure optimal data processing performance and manage data flow between various cloud applications and databases.
- Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions.
- Implement best practices for data governance, security, and data quality within the integration solutions.
- Perform unit testing and debugging of IICS data integration tasks.
- Optimize integration workflows to ensure they meet performance and scalability needs.

Key Skills:
- Hands-on experience with IICS (Informatica Intelligent Cloud Services).
- Strong knowledge and experience working with Snowflake as a cloud data warehouse.
- Proficiency in building ETL/ELT workflows, including integration of various data sources into Snowflake.
- Experience with SQL and writing complex queries for data transformation and manipulation.
- Familiarity with data integration techniques and best practices for cloud-based platforms.
- Experience with cloud integration platforms and working with RESTful APIs and other integration protocols (see the sketch below).
- Ability to troubleshoot, optimize, and maintain data pipelines effectively.
- Knowledge of data governance, security principles, and data quality standards.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Minimum of 5 years of experience in data integration development.
- Proficiency in Snowflake and cloud-based data solutions.
- Strong understanding of ETL/ELT processes and integration design principles.
- Experience working in Agile or similar development methodologies.

Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh
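The RESTful-API integration work mentioned above usually reduces to a paged extract-and-stage loop, which IICS would normally handle inside a mapping task. The sketch below just illustrates the underlying pattern in Python; the endpoint, pagination scheme, and retry policy are invented for illustration:

```python
import time
import requests

BASE_URL = "https://api.example.com/v1/orders"  # hypothetical source system

def fetch_all(page_size: int = 100, max_retries: int = 3) -> list[dict]:
    """Pull every record from a paged REST endpoint with simple retry."""
    records, page = [], 1
    while True:
        for attempt in range(max_retries):
            resp = requests.get(
                BASE_URL, params={"page": page, "size": page_size}, timeout=30
            )
            if resp.ok:
                break
            time.sleep(2 ** attempt)  # exponential backoff before retrying
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return records  # empty page: no more data
        records.extend(batch)
        page += 1

rows = fetch_all()
print(f"staged {len(rows)} records for loading into the warehouse")
```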

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Hosur

Work from Office


Company Name: Titan
Job Title: TEAL - Master Data Operations & Internal Audit
Job Type: Regular / Permanent
Job Category: Aerospace and Defence
Department: Supply Chain Management
Location: Hosur, Tamil Nadu, India

Titan, a leading company in the Aerospace and Defence industry, is seeking a highly skilled and experienced individual to join our team for TEAL - Master Data Operations & Internal Audit. This is a regular/permanent position within our Supply Chain Management department, based in Hosur, Tamil Nadu, India.

Key Responsibilities:
- Manage and maintain all master data related to supply chain operations, including but not limited to material master, vendor master, and customer master.
- Ensure accuracy and completeness of master data, and identify and resolve any discrepancies or issues.
- Develop and implement data governance policies and procedures to maintain data integrity.
- Collaborate with cross-functional teams to identify and implement process improvements related to master data management.
- Conduct regular audits of master data to ensure compliance with company standards and industry regulations.
- Provide support and training to team members on master data management processes and systems.
- Stay updated on industry trends and best practices in master data management.

Qualifications:
- Bachelor's degree in Supply Chain Management, Business Administration, or a related field.
- Minimum of 5 years of experience in master data management, preferably in the Aerospace and Defence industry.
- Strong understanding of supply chain operations and processes.
- Experience with ERP systems and master data management tools.
- Excellent analytical and problem-solving skills.
- Attention to detail and ability to work with large sets of data.
- Strong communication and interpersonal skills.
- Ability to work independently and in a team environment.

If you are a self-motivated and detail-oriented individual with a passion for data management and a background in supply chain operations, we encourage you to apply for this exciting opportunity at Titan. Join our dynamic team and be a part of our mission to deliver high-quality products to our customers in the Aerospace and Defence industry.

Work Experience: Degree / Diploma / Engineering, 7+ years. Communication, teamwork, strategy, logical decision making, presentation, SAP PP and MM, MS Office, business process.

Posted 3 days ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Chennai

Work from Office


Job description for Power BI.

Key Responsibilities:
- Lead BI Strategy: Drive the business intelligence strategy, helping clients or internal teams leverage Power BI and other BI tools to turn complex data into actionable insights.
- Power BI Implementation & Architecture: Design and oversee the implementation of Power BI solutions, from data modeling and ETL processes to reporting and dashboard creation.
- Data Governance & Quality: Ensure best practices for data governance, integrity, and consistency across BI initiatives.
- Mentorship & Leadership: Provide mentorship to junior and mid-level BI professionals, guiding them through technical challenges and career development.
- Cross-Functional Collaboration: Work closely with executive leadership, business analysts, data engineers, and other stakeholders to understand business requirements and translate them into technical solutions.
- Advanced Analytics: Design advanced analytics solutions using DAX, Power Query, and other Power BI tools to solve complex business problems.
- Performance Optimization: Optimize Power BI reports and dashboards to handle large datasets, ensuring high performance and responsiveness.
- Reporting & Insights: Deliver impactful reports and dashboards that help stakeholders make data-driven decisions, ensuring clear communication of findings.
- Strategic Advice: Offer expert advice on BI tools, methodologies, and best practices, contributing to high-level strategic decisions related to data-driven transformations.
- Innovation & Research: Stay up to date with the latest Power BI features and updates, and incorporate new techniques and technologies to continuously improve BI practices.

Posted 3 days ago

Apply

13.0 - 18.0 years

20 - 25 Lacs

Gurugram

Work from Office


Who are we? In one sentence:
We are seeking a GenAI Architect & People Manager with strong technical depth and leadership capabilities to lead our GenAI initiatives. The ideal candidate will possess a robust understanding of Generative AI, Machine Learning, and cloud-based solution delivery, combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable AI solutions while nurturing talent and fostering innovation.

What will your job look like?
- Lead the design and implementation of GenAI and ML-based solutions across business use cases.
- Translate business requirements into technical architectures using Azure/AWS cloud platforms.
- Manage and mentor a multidisciplinary team of engineers, data scientists, and ML specialists.
- Drive adoption of Databricks, PySpark, and Java-based frameworks within solution development.
- Collaborate closely with product owners, data engineering teams, and business stakeholders.
- Ensure high standards in code quality, system performance, and model governance.
- Track industry trends and continuously improve the GenAI strategy and roadmap.
- Oversee the end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support.
- Define and monitor KPIs to measure team performance and project impact.

All you need is...
- 13+ years of overall IT experience with a strong background in the Telecom domain (preferred).
- Proven hands-on experience with GenAI technologies and Machine Learning pipelines.
- Strong understanding of LLMs, Prompt Engineering, RAG (Retrieval-Augmented Generation), and fine-tuning (see the sketch below).
- Demonstrated experience in building and deploying GenAI use cases on Azure or AWS.
- Strong expertise in Databricks, PySpark, and Java.
- In-depth understanding of cloud-native architecture, microservices, and data pipelines.
- Solid people management experience: team building, mentoring, performance reviews.
- Strong analytical thinking and communication skills.

Good to Have Skills:
- Familiarity with LangChain, HuggingFace Transformers, and vector databases (like FAISS, Pinecone).
- Experience with Data Governance, MLOps, and CI/CD for AI/ML models.
- Certification in Azure/AWS (e.g., Azure AI Engineer, AWS Certified Machine Learning).
- Exposure to NLP, speech models, or multimodal AI.

Why you will love this job:
- You will be challenged with leading and mentoring a few development teams and projects.
- You will join a strong team with lots of activities, technologies, business challenges and a progression path.
- You will have the opportunity to work with the industry's most advanced technologies.
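To ground the RAG terminology above: retrieval-augmented generation retrieves the passages most similar to a query (by embedding similarity) and prepends them to the LLM prompt. Below is a dependency-light, hypothetical sketch in Python, with toy hand-written embeddings standing in for a real embedding model and vector database such as FAISS:

```python
import numpy as np

# Toy corpus with made-up 3-d embeddings; production systems would use a real
# embedding model (e.g., from HuggingFace) and a vector store such as FAISS.
docs = {
    "Refunds are processed within 5 business days.": np.array([0.9, 0.1, 0.0]),
    "Our offices are closed on public holidays.":    np.array([0.1, 0.8, 0.2]),
    "Premium plans include priority support.":       np.array([0.2, 0.1, 0.9]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are closest to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

# Pretend this is the embedding of "How long do refunds take?"
query_vec = np.array([0.85, 0.15, 0.05])
context = "\n".join(retrieve(query_vec))

# The augmented prompt is what would actually be sent to the LLM.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How long do refunds take?"
print(prompt)
```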

Posted 3 days ago

Apply

14.0 - 19.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Join our Team

About this opportunity:
As an AI Architect, you will be responsible for designing and delivering scalable, secure, and innovative AI architectures aligned with enterprise and client needs. You will lead the continuous enhancement of AI platforms by enabling new features and capabilities that drive AI adoption and deliver measurable business value through practical use case implementations. Your role will be pivotal in shaping AI strategy, operationalizing AI solutions, and fostering strong client relationships.

What you will do:
- AI Platform Enhancement & Innovation: Lead the evaluation, integration, and enablement of new AI platform features and technologies to continuously evolve AI capabilities and maintain competitive advantage.
- Use Case Identification & Implementation: Collaborate with business stakeholders to identify high-impact AI use cases, design tailored solutions, and oversee end-to-end delivery ensuring alignment with strategic objectives.
- Architecture Design & Governance: Develop and maintain comprehensive AI architectural blueprints and standards that ensure scalability, security, compliance, and interoperability within enterprise IT landscapes.
- Operationalization & MLOps: Architect AI/ML model lifecycle management solutions, including data pipelines, model training, deployment, monitoring, and governance, leveraging cloud and hybrid environments (see the sketch below).
- Stakeholder Leadership: Lead cross-functional design workshops, gain stakeholder buy-in, and act as the principal technical escalation point for AI-related challenges.
- Security & Compliance: Collaborate with IT security and data governance teams to embed privacy, ethical AI principles, and compliance into AI solution architectures.
- Risk Management: Identify and mitigate AI-specific risks such as model bias, data privacy issues, and system vulnerabilities.
- Pre-Sales & Strategy: Partner with sales and business development teams to translate customer requirements into compelling AI solution architectures and proposals.

The skills you bring:
- Proven experience in AI platform architecture, including enhancement, feature enablement, and integration of new AI technologies.
- Strong expertise in AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and cloud AI platforms such as AWS SageMaker, Azure AI, or Google AI Platform.
- Demonstrated ability to architect and operationalize AI pipelines and MLOps solutions in cloud and hybrid environments.
- Proficiency in AI security, privacy, and ethical considerations, ensuring compliant and responsible AI deployments.
- Experience leading technical workshops, managing stakeholder expectations, and driving consensus on AI designs.
- Strong programming background and familiarity with container technologies (Docker, Kubernetes).
- Excellent communication skills to articulate complex AI concepts to both technical and non-technical audiences.

Education & Experience:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
- 14+ years of IT experience with at least 7+ years focused on AI architecture, AI platform development, and solution delivery.
- Hands-on experience with AI model development, deployment, and lifecycle management.
- Proven track record of driving AI adoption through platform enhancements and use case implementations in enterprise settings.

Why join Ericsson?
What happens once you apply?

Primary country and city: India (IN) || Bangalore
Req ID: 768758
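As a concrete, deliberately minimal illustration of the model-lifecycle work described above, the sketch below trains, evaluates, versions, and reloads a scikit-learn model. A production MLOps stack would replace the local files and dict with a model registry and experiment tracker; the version string and file names are invented:

```python
import json
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Train: synthetic data stands in for a governed feature pipeline.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluate and record lineage metadata alongside the artifact.
metadata = {
    "model_version": "1.0.0",
    "algorithm": "LogisticRegression",
    "test_accuracy": float(accuracy_score(y_test, model.predict(X_test))),
}
joblib.dump(model, "model-1.0.0.joblib")
with open("model-1.0.0.json", "w") as f:
    json.dump(metadata, f)

# Deploy/monitor step: a serving process reloads the exact versioned artifact.
served = joblib.load("model-1.0.0.joblib")
print(metadata, served.predict(X_test[:3]))
```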

Posted 3 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office


- Minimum of 5 years of exposure to Camunda version 8 design and development.
- Hands-on experience in full-stack development, especially Java, Spring Boot, microservices, and React JS/Angular.
- Expertise in DevOps, CI/CD, Kubernetes, Terraform, Helm charts, and EKS.
- Experience with cloud infrastructure platforms (AWS preferred) and setting best practices, compliance, and data governance.
- Integration experience using REST/JSON.
- Knowledge of QA and automation.
- Able to define IT strategy and end-to-end solutioning, architecting it and presenting it to a wider audience.
- Participate in and plan scrum calls and track updates for each POD.
- Mentor and guide junior team members, fostering a culture of knowledge sharing and continuous learning.

Posted 3 days ago

Apply

7.0 - 12.0 years

15 - 17 Lacs

Hyderabad, Pune, Chennai

Work from Office


Job Title: Senior Data Analyst - Data Governance

About Us:
Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities worldwide, we support 100+ clients in the banking, financial, and energy sectors, and are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry; projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Description:
Mandatory skills: Data Analysis + Data Governance, Collibra, Python
Location: Bangalore, Pune, Chennai, Hyderabad
Notice: Immediate to 30 days
Level: M3/M4 (7+ years)

Responsibilities:
1. Design, document and advise on implementing Data Discovery and Data Control Fix for a premier global bank in the wealth and personal banking segment, extensively using Collibra.
2. Update and maintain process metadata, along with critical data elements, the preferred business glossary, and the respective technical metadata for critical global services from various regions in the DG.
3. Understand the functions of various enterprise information management applications, and map the data lineage of data elements along with the flow types and consumption status.
4. Work with the data quality team and establish proactive data quality controls by implementing a strong and scalable governance process (see the sketch below).
5. Create and promote the use of common data assets, such as business glossaries, reference data, data inventories, data models and data catalogs within the organization, thereby improving awareness of Data Governance.
6. Monitor adherence to data policies and standards, governing potential policy deviations and escalating where necessary.
7. Establish data quality standards, procedures and protocols to ensure the accuracy, completeness, and consistency of data across the organization.
8. Assist in the implementation of data classification processes to protect sensitive information appropriately.
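Since the posting pairs Python with data governance, here is a minimal, hypothetical sketch of the "proactive data quality control" idea from point 4: declare rules once, run them against each batch, and escalate failures. The rule names, thresholds, sample data, and escalation stub are all invented for illustration:

```python
import pandas as pd

# Declarative DQ rules: name -> (check function, allowed failure rate).
RULES = {
    "customer_id_not_null": (lambda df: df["customer_id"].notna(), 0.0),
    "balance_non_negative": (lambda df: df["balance"] >= 0, 0.01),
}

def run_controls(df: pd.DataFrame) -> list[str]:
    """Evaluate every rule; return the names of rules whose failure rate
    exceeds the allowed threshold, for escalation to data stewards."""
    breaches = []
    for name, (check, threshold) in RULES.items():
        failure_rate = 1.0 - check(df).mean()
        if failure_rate > threshold:
            breaches.append(f"{name}: {failure_rate:.1%} failures")
    return breaches

batch = pd.DataFrame({"customer_id": [1, None, 3], "balance": [100.0, 50.0, -5.0]})
for breach in run_controls(batch):
    print("ESCALATE:", breach)  # stand-in for a ticketing/alerting integration
```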

Posted 3 days ago

Apply

7.0 - 12.0 years

7 - 8 Lacs

Bengaluru

Work from Office


Minimum 7 years of relevant experience in Information Security, Data Governance, or Compliance roles.
- Manage Symantec DLP infrastructure (Network, Endpoint, and Cloud components).
- Maintain and migrate DLP policies and rules from Symantec DLP to Microsoft Purview as per business needs.
- Configure and manage Microsoft Purview Information Protection & Data Governance policies, including: sensitivity labels and auto-labeling; insider risk management; data lifecycle policies.
- Implement Microsoft Purview eDiscovery, Audit, and Compliance Manager solutions.
- Collaborate with Security, Legal, and Compliance teams to ensure the M365 data compliance posture.
- Define and implement data retention schedules in alignment with legal, regulatory, and business requirements.
- Lead the implementation of archiving solutions (e.g., Microsoft Exchange Online Archiving, Azure Information Protection, third-party tools).
- Coordinate with Records Management and Legal teams to maintain defensible deletion and audit readiness.
- Support migrations and lifecycle management for legacy data stores.
- Hands-on expertise with Microsoft Purview and the Microsoft 365 Security & Compliance Center.
- Strong understanding of data classification, encryption, auditing, and compliance standards (e.g., GDPR, HIPAA, SOX).

Posted 3 days ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Prayagraj, Varanasi, Ghaziabad

Work from Office


Does working for 150+ million children of Bharat excite you? Then this opportunity is for you!

About us:
We are a leading Conversational AI company that's revolutionizing education for millions worldwide. Our knowledge bots are already empowering 35 million users, and we're at the forefront of shaping the future of EdTech in Naya Bharat. We're creating an omniverse in Conversational AI, where developers collaborate to innovate together. As part of our team, you'll have a pivotal role in turning complex educational data into practical insights that drive real change. We're deeply committed to enhancing education for 150 million children in India, partnering with state departments and supporting national initiatives like Vidhya Samiksha Kendra under the National Education Policy 2020. ConveGenius operates across three divisions: ConveGenius Digital uses AI and bots to make systemic improvements, ConveGenius Edu offers Swift PAL tablets and AR-enhanced learning, and ConveGenius Insights leads global research in educational science. If you're passionate about making a meaningful impact in education, have experience in both business and social sectors, and thrive in fast-paced environments, join us in transforming EdTech for Naya Bharat. Embrace our startup culture, where innovation and determination reshape India's educational future. Learn more about us: https://linktr.ee/convegenius11

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes to efficiently ingest, transform, and load data from various sources into data warehouses and data lakes (a sketch follows this listing).
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and design data models that facilitate efficient data retrieval and analysis.
- Optimize data pipeline performance, ensuring scalability, reliability, and data integrity.
- Implement data governance and security measures to ensure compliance with data privacy regulations and protect sensitive information.
- Identify and implement appropriate tools and technologies to enhance data engineering capabilities and automate processes.
- Conduct thorough testing and validation of data pipelines to ensure data accuracy and quality.
- Monitor and troubleshoot data pipelines to identify and resolve issues, ensuring minimal downtime.
- Develop and maintain documentation, including data flow diagrams, technical specifications, and user guides.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field. A master's degree is a plus.
- Proven experience as a Data Engineer or in a similar role, with a strong understanding of data engineering concepts, practices, and tools.
- Proficiency in programming languages such as Python, Java, or Scala, and experience with data manipulation and transformation frameworks/libraries (e.g., Apache Spark, Pandas, SQL).
- Solid understanding of relational databases, data modeling, and SQL queries.
- Experience with distributed computing frameworks, such as Apache Hadoop, Apache Kafka, or Apache Flink.
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and experience with cloud-based data engineering services (e.g., Amazon Redshift, Google BigQuery, Azure Data Factory).
- Familiarity with data warehousing concepts and technologies (e.g., dimensional modeling, columnar databases).

What We Offer & Benefits:
At ConveGenius, we believe in creating a supportive and dynamic work environment where you can thrive professionally and personally. If you're passionate about making a difference in education and enjoy working in a diverse and inclusive setting, ConveGenius is the place for you!
- Experience working with a diverse team of professionals located throughout India.
- Be part of an organization that operates in over two-thirds of India's states.
- Play a crucial role in transforming the education sector in India.
- Enjoy the security and peace of mind that comes with health insurance coverage.
- Benefit from a flexible leave policy, including special provisions for period leaves.
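A deliberately small illustration of the ingest-transform-load loop named in the responsibilities, in Python/pandas with a validation gate before the load step (the source file, columns, rules, and target are hypothetical):

```python
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)  # e.g., a daily quiz-attempts export

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["student_id"])   # drop unattributable rows
    df = df[df["attempted"] > 0]            # avoid divide-by-zero below
    df["score_pct"] = 100 * df["correct"] / df["attempted"]
    return df

def validate(df: pd.DataFrame) -> None:
    # Fail the pipeline loudly rather than load bad data downstream.
    assert df["score_pct"].between(0, 100).all(), "score_pct out of range"
    assert not df["student_id"].duplicated().any(), "duplicate student rows"

def load(df: pd.DataFrame, path: str) -> None:
    df.to_parquet(path)  # stand-in for a warehouse or data-lake write

if __name__ == "__main__":
    data = transform(extract("quiz_attempts.csv"))
    validate(data)
    load(data, "quiz_attempts.parquet")
```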

Posted 3 days ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office


What we offer
Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, be an early member of Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform
This vertical is responsible for building the data platform: optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies (see the sketch below).
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of which is in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience in managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
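Here is a minimal PySpark sketch of the batch extract-transform-load step referenced in the day-to-day list, reading raw CSVs and writing partitioned Parquet to S3. Bucket names, columns, and the partitioning scheme are hypothetical; on EMR the S3 connector is available out of the box:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-txn-batch").getOrCreate()

# Extract: raw transaction dumps landed by upstream systems (placeholder path).
raw = spark.read.option("header", True).csv("s3://example-raw/transactions/2025-06-20/")

# Transform: type the columns, drop obvious bad rows, derive a partition key.
txns = (raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("txn_date", F.to_date("txn_ts"))
           .filter(F.col("amount").isNotNull() & (F.col("amount") > 0)))

# Load: columnar, partitioned output that downstream Glue/Redshift jobs consume.
(txns.write.mode("overwrite")
     .partitionBy("txn_date")
     .parquet("s3://example-curated/transactions/"))

spark.stop()
```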

Posted 3 days ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office


What we offer Our mission is simple – Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak’s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. As a member of this team, you get opportunity to learn fintech space which is most sought-after domain in current world, be a early member in digital transformation journey of Kotak, learn and leverage technology to build complex data data platform solutions including, real time, micro batch, batch and analytics solutions in a programmatic way and also be futuristic to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals Data Platform This Vertical is responsible for building data platform which includes optimized storage for entire bank and building centralized data lake, managed compute and orchestrations framework including concepts of serverless data solutions, managing central data warehouse for extremely high concurrency use cases, building connectors for different sources, building customer feature repository, build cost optimization solutions like EMR optimizers, perform automations and build observability capabilities for Kotak’s data platform. The team will also be center for Data Engineering excellence driving trainings and knowledge sharing sessions with large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumptions for 30+ data analytics products. The team will learn and built data models in a config based and programmatic and think big to build one of the most leveraged data model for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data build by this team will be consumed by 20K + branch consumers, RMs, Branch Managers and all analytics usecases. 
Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
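The config-based, programmatic data modelling mentioned under Data Engineering can be pictured with the following minimal PySpark sketch. The config schema and the view, dataset, and S3 path names are hypothetical, standing in for whatever metadata store the team actually uses.

    # Minimal sketch: declare derived tables as config, materialize them in a loop.
    from pyspark.sql import SparkSession

    MODELS = [  # hypothetical declarative config, one entry per derived table
        {
            "view": "transactions",
            "source": "s3://example-data-lake/curated/transactions/",
            "sql": """
                SELECT branch_id, txn_date,
                       COUNT(*)    AS txn_count,
                       SUM(amount) AS total_amount
                FROM transactions
                GROUP BY branch_id, txn_date
            """,
            "target": "s3://example-data-lake/models/daily_branch_summary/",
        },
    ]

    spark = SparkSession.builder.appName("config-driven-models").getOrCreate()

    for model in MODELS:
        # Expose the source as a SQL view, run the declared transform, write the result.
        spark.read.parquet(model["source"]).createOrReplaceTempView(model["view"])
        spark.sql(model["sql"]).write.mode("overwrite").parquet(model["target"])

Scaling to thousands of datasets then means adding config entries rather than writing new jobs, which is what makes a 100+ source-system estate tractable for one team.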

Posted 3 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office



Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies