
1376 Data Governance Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

9 - 12 Lacs

Kolkata

Work from Office

Naukri logo

Experience Required: 7+ years

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
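The SQL expertise this posting calls for (complex joins, window functions) can be illustrated with a small, self-contained sketch. This uses Python's built-in sqlite3 as a stand-in for Snowflake, and the table and column names are hypothetical:

```python
import sqlite3

# Illustrative only: a tiny "orders" table standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 100.0),
        ('acme', '2024-02-10', 250.0),
        ('globex', '2024-01-20', 75.0);
""")

# Window function: running total of order amounts per customer, ordered by date.
rows = conn.execute("""
    SELECT customer, order_date, amount,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape carries over to Snowflake SQL, where such queries typically run against much larger fact tables.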

Posted 3 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Senior SAP SAC Consultant - Job Responsibilities

Solution Design & Architecture
- Lead the design and architecture of SAP Analytics Cloud (SAC) solutions aligned with business objectives.
- Translate complex business requirements into technical SAC models and dashboards.
- Define data architecture, models (live/acquired), and connectivity with SAP and non-SAP systems (e.g., BW/4HANA, S/4HANA, HANA, SQL).

Dashboard & Story Development
- Develop interactive and visually compelling SAC stories and dashboards using advanced scripting and calculation capabilities.
- Customize UI/UX using SAC features such as widgets, charts, filters, and responsive pages.

Data Modeling & Integration
- Design and build data models within SAC and integrate external datasets as needed.
- Ensure high performance and accuracy through optimized data transformations and blending.
- Configure and manage data import/export jobs and schedules.

Advanced Analytics & Planning
- Utilize SAC's predictive capabilities, Smart Insights, and Smart Discovery to provide actionable insights.
- Implement and manage planning scenarios, input forms, allocation logic, and forecast models (if applicable).

Stakeholder Collaboration
- Act as the key point of contact between business users and IT, gathering requirements and providing best-practice solutions.
- Conduct workshops, training sessions, and end-user support activities.

Performance Optimization & Governance
- Optimize SAC reports and stories for performance and usability.
- Enforce data governance, security roles, and access controls within SAC.

Project Management & Leadership
- Lead the end-to-end project lifecycle for SAC implementations, upgrades, and enhancements.
- Mentor junior consultants and provide technical guidance to cross-functional teams.

Documentation & Compliance
- Prepare technical documentation, user guides, and test scripts.
- Ensure compliance with internal data security and external regulatory standards.

Innovation & Continuous Improvement
- Stay current with SAC updates, roadmap features, and SAP BTP innovations.
- Proactively suggest improvements to enhance analytics maturity and value delivery.

Posted 3 days ago

Apply

4.0 - 7.0 years

11 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Manager, Business Intelligence & Analytics

Analyze and interpret large sets of data to provide valuable insights and strategic recommendations to the organization. The role involves leveraging data to drive strategic decision-making, optimizing business processes, and enabling the organization to gain a competitive edge through data-driven insights.

1. Data Analysis: Prepare dashboards in the Microsoft BI tool. Collaborate with teams to collect, clean, and analyze data from various sources to identify trends, patterns, and correlations.
2. Reporting and Visualization: Create reports, dashboards, and visualizations to present data-driven insights in a clear and concise manner to management.
3. Business Intelligence Strategy: Develop and implement a comprehensive business intelligence strategy that aligns with the organization's goals and objectives.
4. Data-driven Decision Making: Assist senior management in making informed decisions by providing data-backed recommendations and insights.
5. Cross-functional Collaboration: Work closely with other departments such as marketing, finance, product development, and IT to understand their data needs and provide actionable insights to support their objectives.
6. Data Governance: Ensure data quality, integrity, and security by establishing and enforcing data governance policies and procedures.
7. Emerging Technologies: Stay updated with the latest trends and advancements in business intelligence tools, data analytics techniques, and data visualization platforms to enhance the team's capabilities.
8. Team Leadership: Roll out monthly campaigns for all countries to secure promotions from hotels and activities to improve profitability. Negotiate overrides with various hotels, activities, excursions, and third-party suppliers to improve sales and margins.

Required Qualifications:
- Bachelor's degree in a relevant field such as Business Administration, Information Systems, or a related discipline.
- Strong analytical skills for gathering and interpreting information to identify trends.
- Proficiency in using data analysis tools.
- Ability to visualize data effectively using visualization tools like Power BI.
- Familiarity with business intelligence concepts, methodologies, and data governance.
- Travel industry knowledge.
- Excellent written and verbal communication skills.

Desired Qualifications:
- Experience in forecasting and predictive modelling.
- Data mining and machine learning techniques.
- Experience in designing and creating interactive reports, dashboards, and visualizations that effectively communicate data insights to various stakeholders.
- Knowledge of data privacy and security.

Job Type: Full Time
Job Location: India
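The "forecasting and predictive modelling" item above can be made concrete with a deliberately minimal sketch: a moving-average forecast over monthly figures. This is illustrative only; the numbers and function name are hypothetical, and real BI forecasting would typically use a tool's built-ins or a statistics library.

```python
def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(history[-window:]) / window

# Hypothetical monthly sales figures; the forecast is the mean of the last three.
monthly_sales = [120, 130, 125, 140, 150, 145]
print(moving_average_forecast(monthly_sales))
```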

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office


IICS Developer

Job Overview: We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience in the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role involves working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes.

Key Responsibilities:
- Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services).
- Work with Snowflake data warehouse solutions, including data loading, transformation, and querying.
- Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake.
- Troubleshoot and resolve integration issues within the IICS platform and Snowflake.
- Ensure optimal data processing performance and manage data flow between various cloud applications and databases.
- Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions.
- Implement best practices for data governance, security, and data quality within the integration solutions.
- Perform unit testing and debugging of IICS data integration tasks.
- Optimize integration workflows to ensure they meet performance and scalability needs.

Key Skills:
- Hands-on experience with IICS (Informatica Intelligent Cloud Services).
- Strong knowledge of and experience working with Snowflake as a cloud data warehouse.
- Proficiency in building ETL/ELT workflows, including integration of various data sources into Snowflake.
- Experience with SQL and writing complex queries for data transformation and manipulation.
- Familiarity with data integration techniques and best practices for cloud-based platforms.
- Experience with cloud integration platforms and working with RESTful APIs and other integration protocols.
- Ability to troubleshoot, optimize, and maintain data pipelines effectively.
- Knowledge of data governance, security principles, and data quality standards.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Minimum of 5 years of experience in data integration development.
- Proficiency in Snowflake and cloud-based data solutions.
- Strong understanding of ETL/ELT processes and integration design principles.
- Experience working in Agile or similar development methodologies.

Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh
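A core pattern behind the integration pipelines described above is the idempotent "merge" load: re-running a batch updates existing rows instead of duplicating them (Snowflake exposes this as MERGE). A hedged sketch, using SQLite's UPSERT as a stand-in and hypothetical table names:

```python
import sqlite3

# In-memory database standing in for a warehouse; dim_customer is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def load_batch(conn, batch):
    """Idempotent load: insert new keys, update existing ones."""
    conn.executemany(
        """INSERT INTO dim_customer (id, name) VALUES (?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name""",
        batch,
    )

load_batch(conn, [(1, "Acme"), (2, "Globex")])
# Re-running with overlapping keys updates rather than duplicates.
load_batch(conn, [(2, "Globex Corp"), (3, "Initech")])

rows = conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall()
print(rows)
```

The same re-runnable behavior is what makes an integration task safe to retry after a failure mid-batch.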

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Hosur

Work from Office


Company Name: Titan
Job Title: TEAL - Master Data Operations & Internal Audit
Job Type: Regular / Permanent
Job Category: Aerospace and Defence
Department: Supply Chain Management
Location: Hosur, Tamil Nadu, India

Titan, a leading company in the Aerospace and Defence industry, is seeking a highly skilled and experienced individual to join our team in TEAL - Master Data Operations & Internal Audit. This is a regular/permanent position within our Supply Chain Management department, based in Hosur, Tamil Nadu, India.

Key Responsibilities:
- Manage and maintain all master data related to supply chain operations, including but not limited to material master, vendor master, and customer master.
- Ensure accuracy and completeness of master data, and identify and resolve any discrepancies or issues.
- Develop and implement data governance policies and procedures to maintain data integrity.
- Collaborate with cross-functional teams to identify and implement process improvements related to master data management.
- Conduct regular audits of master data to ensure compliance with company standards and industry regulations.
- Provide support and training to team members on master data management processes and systems.
- Stay updated on industry trends and best practices in master data management.

Qualifications:
- Bachelor's degree in Supply Chain Management, Business Administration, or a related field.
- Minimum of 5 years of experience in master data management, preferably in the Aerospace and Defence industry.
- Strong understanding of supply chain operations and processes.
- Experience with ERP systems and master data management tools.
- Excellent analytical and problem-solving skills.
- Attention to detail and ability to work with large sets of data.
- Strong communication and interpersonal skills.
- Ability to work independently and in a team environment.

If you are a self-motivated and detail-oriented individual with a passion for data management and a background in supply chain operations, we encourage you to apply for this exciting opportunity at Titan. Join our dynamic team and be a part of our mission to deliver high-quality products to our customers in the Aerospace and Defence industry.

Work Experience: Degree / Diploma / Engineering, 7+ years
Skills: Communication, Team Work, Strategy, Logical Decision Making, Presentation, SAP PP and MM, MS Office, Business Process

Posted 3 days ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Chennai

Work from Office


Job Description for Power BI

Key Responsibilities:
- Lead BI Strategy: Drive the business intelligence strategy, helping clients or internal teams leverage Power BI and other BI tools to turn complex data into actionable insights.
- Power BI Implementation & Architecture: Design and oversee the implementation of Power BI solutions, from data modeling and ETL processes to reporting and dashboard creation.
- Data Governance & Quality: Ensure best practices for data governance, integrity, and consistency across BI initiatives.
- Mentorship & Leadership: Provide mentorship to junior and mid-level BI professionals, guiding them through technical challenges and career development.
- Cross-Functional Collaboration: Work closely with executive leadership, business analysts, data engineers, and other stakeholders to understand business requirements and translate them into technical solutions.
- Advanced Analytics: Design advanced analytics solutions using DAX, Power Query, and other Power BI tools to solve complex business problems.
- Performance Optimization: Optimize Power BI reports and dashboards to handle large datasets, ensuring high performance and responsiveness.
- Reporting & Insights: Deliver impactful reports and dashboards that help stakeholders make data-driven decisions, ensuring clear communication of findings.
- Strategic Advice: Offer expert advice on BI tools, methodologies, and best practices, contributing to high-level strategic decisions related to data-driven transformations.
- Innovation & Research: Stay up to date with the latest Power BI features and updates, and incorporate new techniques and technologies to continuously improve BI practices.
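The data modeling responsibility above usually means a star schema: a fact table keyed to dimension tables, over which BI measures aggregate. A toy illustration in plain Python, with hypothetical table and column names (a Power BI model would express the same measure in DAX):

```python
# Dimension table: region_id -> region name.
dim_region = {1: "North", 2: "South"}

# Fact table rows: (region_id, sales amount).
fact_sales = [
    (1, 100.0),
    (1, 50.0),
    (2, 200.0),
]

# The equivalent of a simple BI measure: total sales per region,
# resolved by joining each fact row to its dimension.
totals = {}
for region_id, amount in fact_sales:
    region = dim_region[region_id]
    totals[region] = totals.get(region, 0.0) + amount

print(totals)
```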

Posted 3 days ago

Apply

13.0 - 18.0 years

20 - 25 Lacs

Gurugram

Work from Office


Who are we? In one sentence:
We are seeking a GenAI Architect & People Manager with strong technical depth and leadership capabilities to lead our GenAI initiatives. The ideal candidate will possess a robust understanding of Generative AI, Machine Learning, and cloud-based solution delivery, combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable AI solutions while nurturing talent and fostering innovation.

What will your job look like?
- Lead the design and implementation of GenAI and ML-based solutions across business use cases.
- Translate business requirements into technical architectures using Azure/AWS cloud platforms.
- Manage and mentor a multidisciplinary team of engineers, data scientists, and ML specialists.
- Drive adoption of Databricks, PySpark, and Java-based frameworks within solution development.
- Collaborate closely with product owners, data engineering teams, and business stakeholders.
- Ensure high standards in code quality, system performance, and model governance.
- Track industry trends and continuously improve the GenAI strategy and roadmap.
- Oversee the end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support.
- Define and monitor KPIs to measure team performance and project impact.

All you need is...
- 13+ years of overall IT experience with a strong background in the Telecom domain (preferred).
- Proven hands-on experience with GenAI technologies and Machine Learning pipelines.
- Strong understanding of LLMs, Prompt Engineering, RAG (Retrieval-Augmented Generation), and fine-tuning.
- Demonstrated experience in building and deploying GenAI use cases on Azure or AWS.
- Strong expertise in Databricks, PySpark, and Java.
- In-depth understanding of cloud-native architecture, microservices, and data pipelines.
- Solid people management experience: team building, mentoring, performance reviews.
- Strong analytical thinking and communication skills.

Good to Have Skills:
- Familiarity with LangChain, HuggingFace Transformers, and vector databases (like FAISS, Pinecone).
- Experience with Data Governance, MLOps, and CI/CD for AI/ML models.
- Certification in Azure/AWS (e.g., Azure AI Engineer, AWS Certified Machine Learning).
- Exposure to NLP, speech models, or multimodal AI.

Why you will love this job:
- You will be challenged with leading and mentoring a few development teams and projects.
- You will join a strong team with lots of activities, technologies, business challenges, and a progression path.
- You will have the opportunity to work with the industry's most advanced technologies.
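The RAG (Retrieval-Augmented Generation) concept mentioned above has a retrieval step that can be sketched very simply: score candidate documents against the query and hand the best match to the LLM as context. A toy sketch with bag-of-words cosine similarity; the corpus is hypothetical, and production systems would use embedding models and vector databases (FAISS, Pinecone) instead:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical mini-corpus of documents to retrieve from.
docs = [
    "telecom churn prediction with machine learning",
    "invoice processing and billing pipelines",
    "prompt engineering patterns for llms",
]

def retrieve(query: str) -> str:
    """Return the document that best matches the query (the 'R' in RAG)."""
    q = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(q, Counter(d.lower().split())))

print(retrieve("llm prompt engineering"))
```

The retrieved text would then be prepended to the prompt so the model answers from grounded context rather than memory alone.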

Posted 3 days ago

Apply

14.0 - 19.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Join our Team

About this opportunity:
As an AI Architect, you will be responsible for designing and delivering scalable, secure, and innovative AI architectures aligned with enterprise and client needs. You will lead the continuous enhancement of AI platforms by enabling new features and capabilities that drive AI adoption and deliver measurable business value through practical use case implementations. Your role will be pivotal in shaping AI strategy, operationalizing AI solutions, and fostering strong client relationships.

What you will do:
- AI Platform Enhancement & Innovation: Lead the evaluation, integration, and enablement of new AI platform features and technologies to continuously evolve AI capabilities and maintain competitive advantage.
- Use Case Identification & Implementation: Collaborate with business stakeholders to identify high-impact AI use cases, design tailored solutions, and oversee end-to-end delivery, ensuring alignment with strategic objectives.
- Architecture Design & Governance: Develop and maintain comprehensive AI architectural blueprints and standards that ensure scalability, security, compliance, and interoperability within enterprise IT landscapes.
- Operationalization & MLOps: Architect AI/ML model lifecycle management solutions, including data pipelines, model training, deployment, monitoring, and governance, leveraging cloud and hybrid environments.
- Stakeholder Leadership: Lead cross-functional design workshops, gain stakeholder buy-in, and act as the principal technical escalation point for AI-related challenges.
- Security & Compliance: Collaborate with IT security and data governance teams to embed privacy, ethical AI principles, and compliance into AI solution architectures.
- Risk Management: Identify and mitigate AI-specific risks such as model bias, data privacy issues, and system vulnerabilities.
- Pre-Sales & Strategy: Partner with sales and business development teams to translate customer requirements into compelling AI solution architectures and proposals.

The skills you bring:
- Proven experience in AI platform architecture, including enhancement, feature enablement, and integration of new AI technologies.
- Strong expertise in AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and cloud AI platforms such as AWS SageMaker, Azure AI, or Google AI Platform.
- Demonstrated ability to architect and operationalize AI pipelines and MLOps solutions in cloud and hybrid environments.
- Proficiency in AI security, privacy, and ethical considerations, ensuring compliant and responsible AI deployments.
- Experience leading technical workshops, managing stakeholder expectations, and driving consensus on AI designs.
- Strong programming background and familiarity with container technologies (Docker, Kubernetes).
- Excellent communication skills to articulate complex AI concepts to both technical and non-technical audiences.

Education & Experience:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
- 14+ years of IT experience, with at least 7+ years focused on AI architecture, AI platform development, and solution delivery.
- Hands-on experience with AI model development, deployment, and lifecycle management.
- Proven track record of driving AI adoption through platform enhancements and use case implementations in enterprise settings.

Primary country and city: India (IN) || Bangalore
Req ID: 768758

Posted 3 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office


- Minimum of 5 years of exposure to Camunda version 8 design and development.
- Hands-on experience in full-stack development, especially Java, Spring Boot, microservices, and React JS/Angular.
- Expertise in DevOps, CI/CD, Kubernetes, Terraform, Helm charts, and EKS.
- Experience with cloud infrastructure platforms (AWS preferred), and with setting best practices, compliance, and data governance.
- Integration experience using REST/JSON.
- Knowledge of QA and automation.
- Able to define IT strategy and E2E solutions, architecting them and presenting to a wider audience.
- Participate in and plan scrum calls, and track updates for each POD.
- Mentor and guide junior team members, fostering a culture of knowledge sharing and continuous learning.

Posted 3 days ago

Apply

7.0 - 12.0 years

15 - 17 Lacs

Hyderabad, Pune, Chennai

Work from Office


Job Title: Senior Data Analyst - Data Governance

About Us:
Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year in the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Description:
Mandate Skills: Data Analysis + Data Governance, Collibra, Python
Location: Bangalore, Pune, Chennai, Hyderabad
Notice: Immediate to 30 days
Level: M3/M4 (7+ years)

1. Design, document, and advise on implementing Data Discovery and Data Control Fix for a premier global bank in the wealth and personal banking segment, extensively using Collibra.
2. Responsible for updating and maintaining process metadata, along with critical data elements, the preferred business glossary, and the respective technical metadata for critical global services from various regions in the DG.
3. Understand the functions of various enterprise information management applications; map the data lineage of data elements along with the flow types and consumption status.
4. Work with the data quality team and establish proactive data quality controls by implementing a strong and scalable governance process.
5. Create and promote the use of common data assets, such as business glossaries, reference data, data inventories, data models, and data catalogs within the organization, thereby improving awareness of Data Governance.
6. Monitor adherence to data policies and standards, governing potential policy deviations and escalating where necessary.
7. Establish data quality standards, procedures, and protocols to ensure the accuracy, completeness, and consistency of data across the organization.
8. Assist in the implementation of data classification processes to protect sensitive information appropriately.
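The "proactive data quality controls" this role calls for (and the Python skill it mandates) can be sketched as simple completeness and validity checks run over records before a dataset is published. A hedged illustration; the field names and rules are hypothetical, not a bank's actual controls:

```python
# Hypothetical required fields for a customer record.
REQUIRED_FIELDS = ("customer_id", "country", "balance")

def check_record(rec: dict) -> list[str]:
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if rec.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Validity: a balance, when present, must not be negative.
    if isinstance(rec.get("balance"), (int, float)) and rec["balance"] < 0:
        issues.append("negative balance")
    return issues

records = [
    {"customer_id": "C1", "country": "IN", "balance": 100.0},
    {"customer_id": "C2", "country": "", "balance": -5.0},
]
report = {rec["customer_id"]: check_record(rec) for rec in records}
print(report)
```

In a governance process, such checks would run automatically in the pipeline, with failures escalated per the data policies described above.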

Posted 3 days ago

Apply

7.0 - 12.0 years

7 - 8 Lacs

Bengaluru

Work from Office


- Minimum 7 years of relevant experience in Information Security, Data Governance, or Compliance roles.
- Manage Symantec DLP infrastructure (Network, Endpoint, and Cloud components).
- Maintain and migrate DLP policies and rules from Symantec DLP to Microsoft Purview as per business needs.
- Configure and manage Microsoft Purview Information Protection & Data Governance policies, including: sensitivity labels and auto-labeling; insider risk management; and data lifecycle policies.
- Implement Microsoft Purview eDiscovery, Audit, and Compliance Manager solutions.
- Collaborate with Security, Legal, and Compliance teams to ensure the M365 data compliance posture.
- Define and implement data retention schedules in alignment with legal, regulatory, and business requirements.
- Lead the implementation of archiving solutions (e.g., Microsoft Exchange Online Archiving, Azure Information Protection, third-party tools).
- Coordinate with Records Management and Legal teams to maintain defensible deletion and audit readiness.
- Support migrations and lifecycle management for legacy data stores.
- Hands-on expertise with Microsoft Purview and the Microsoft 365 Security & Compliance Center.
- Strong understanding of data classification, encryption, auditing, and compliance standards (e.g., GDPR, HIPAA, SOX).
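At the heart of the DLP policies mentioned above is pattern detection: finding sensitive data (such as payment card numbers) in content. A toy sketch of the idea, combining a regex with the Luhn checksum to cut false positives; in practice Symantec DLP and Microsoft Purview rules are configured in the product, not hand-coded:

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum, the standard validity test for card-like numbers."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2])
    total += sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    """Find 16-digit runs and keep only those that pass the Luhn check."""
    candidates = re.findall(r"\b\d{16}\b", text)
    return [c for c in candidates if luhn_valid(c)]

# '4539578763621486' passes Luhn; the order reference does not.
sample = "order ref 1234567812345678, card 4539578763621486"
print(find_card_numbers(sample))
```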

Posted 3 days ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Prayagraj, Varanasi, Ghaziabad

Work from Office


Does working for 150+ million children of Bharat excite you? Then this opportunity is for you!

About us:
We are a leading Conversational AI company that's revolutionizing education for millions worldwide. Our knowledge bots already empower 35 million users, and we're at the forefront of shaping the future of EdTech in Naya Bharat. We're creating an omniverse in Conversational AI, where developers collaborate to innovate together. As part of our team, you'll have a pivotal role in turning complex educational data into practical insights that drive real change. We're deeply committed to enhancing education for 150 million children in India, partnering with state departments and supporting national initiatives like Vidhya Samiksha Kendra under the National Education Policy 2020. ConveGenius operates across three divisions: ConveGenius Digital uses AI and bots to make systemic improvements, ConveGenius Edu offers Swift PAL tablets and AR-enhanced learning, and ConveGenius Insights leads global research in educational science. If you're passionate about making a meaningful impact in education, have experience in both business and social sectors, and thrive in fast-paced environments, join us in transforming EdTech for Naya Bharat. Embrace our startup culture, where innovation and determination reshape India's educational future. Learn more about us: https://linktr.ee/convegenius11

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes to efficiently ingest, transform, and load data from various sources into data warehouses and data lakes.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and design data models that facilitate efficient data retrieval and analysis.
- Optimize data pipeline performance, ensuring scalability, reliability, and data integrity.
- Implement data governance and security measures to ensure compliance with data privacy regulations and protect sensitive information.
- Identify and implement appropriate tools and technologies to enhance data engineering capabilities and automate processes.
- Conduct thorough testing and validation of data pipelines to ensure data accuracy and quality.
- Monitor and troubleshoot data pipelines to identify and resolve issues, ensuring minimal downtime.
- Develop and maintain documentation, including data flow diagrams, technical specifications, and user guides.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field. A master's degree is a plus.
- Proven experience as a Data Engineer or in a similar role, with a strong understanding of data engineering concepts, practices, and tools.
- Proficiency in programming languages such as Python, Java, or Scala, and experience with data manipulation and transformation frameworks/libraries (e.g., Apache Spark, Pandas, SQL).
- Solid understanding of relational databases, data modeling, and SQL queries.
- Experience with distributed computing frameworks such as Apache Hadoop, Apache Kafka, or Apache Flink.
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and experience with cloud-based data engineering services (e.g., Amazon Redshift, Google BigQuery, Azure Data Factory).
- Familiarity with data warehousing concepts and technologies (e.g., dimensional modeling, columnar databases).

What We Offer & Benefits:
At ConveGenius, we believe in creating a supportive and dynamic work environment where you can thrive professionally and personally. If you're passionate about making a difference in education and enjoy working in a diverse and inclusive setting, ConveGenius is the place for you!
- Experience working with a diverse team of professionals located throughout India.
- Be part of an organization that operates in over two-thirds of India's states.
- Play a crucial role in transforming the education sector in India.
- Enjoy the security and peace of mind that comes with health insurance coverage.
- Benefit from a flexible leave policy, including special provisions for period leaves.
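The ETL responsibility described in this posting can be sketched end to end in a few lines: extract rows from a source, transform (clean and typecast), and load into a warehouse table. A minimal, hedged illustration using stdlib modules, with an in-memory SQLite database standing in for the warehouse and hypothetical column names:

```python
import csv
import io
import sqlite3

# Extract: a CSV source (an in-memory string standing in for a real file/feed).
source = io.StringIO("student_id,score\nS1,80\nS2,\nS3,95\n")
rows = list(csv.DictReader(source))

# Transform: drop incomplete rows, cast score to int.
clean = [(r["student_id"], int(r["score"])) for r in rows if r["score"]]

# Load: write into a warehouse table (SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (student_id TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)", clean)

# Downstream analytics query over the loaded data.
avg = conn.execute("SELECT AVG(score) FROM scores").fetchone()[0]
print(avg)
```

The same extract-transform-load shape scales up to Spark jobs feeding Redshift or BigQuery, with the transform step carrying the data-quality rules.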

Posted 3 days ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office


What we offer

Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team:
DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.
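The orchestration framework mentioned above (and tools like Airflow/MWAA used by this team) rests on one core idea: pipeline tasks form a DAG, and the scheduler runs them in dependency order. A toy sketch with the stdlib `graphlib`; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# A pipeline DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}

# The scheduler's job, in miniature: a valid topological execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency-ordering model is the same.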
If you’ve right data skills and are ready for building data lake solutions from scratch for high concurrency systems involving multiple systems then this is the team for you. You day to day role will include Drive business decisions with technical input and lead the team. Design, implement, and support an data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/ SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field 3-5 years of experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager 10+ years of engineering experience most of which is in Data domain 5+ years of engineering team management experience 10+ years of planning, designing, developing and delivering consumer software experience - Experience partnering with product or program management teams 5+ years of experience in managing data engineer, business intelligence engineers and/or data scientists Experience designing or architecting (design patterns, reliability 
and scaling) of new and existing systems Experience managing multiple concurrent programs, projects and development teams in an Agile environment Strong understanding of Data Platform, Data Engineering and Data Governance Experience partnering with product and program management teams - Experience designing and developing large scale, high-traffic applications PREFERRED QUALIFICATIONS AWS cloud technologiesRedshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in Indian Banking segment and/or Fintech is desired. Experience with Non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large volume data processing Strong presentation and communications skills.
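The Data Engineer qualifications above call for strong advanced SQL, including analytic patterns such as window functions. As a hedged illustration only (table and column names are invented, and SQLite stands in for whatever warehouse the role actually uses), a running total per branch looks like this:

```python
import sqlite3

# Illustrative schema: daily branch transaction totals (hypothetical names).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txn (branch TEXT, day TEXT, amount REAL);
INSERT INTO txn VALUES
  ('BLR-01', '2024-01-01', 100.0),
  ('BLR-01', '2024-01-02', 150.0),
  ('BLR-01', '2024-01-03',  50.0),
  ('MUM-01', '2024-01-01', 200.0),
  ('MUM-01', '2024-01-02', 300.0);
""")

# Running total per branch: a typical window-function pattern.
# (SQLite supports window functions since version 3.25.)
rows = conn.execute("""
SELECT branch, day, amount,
       SUM(amount) OVER (PARTITION BY branch ORDER BY day) AS running_total
FROM txn
ORDER BY branch, day
""").fetchall()

for r in rows:
    print(r)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape carries over to Redshift and Spark SQL, which the posting lists.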

Posted 3 days ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising roughly 10 sub-teams that independently drive their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to build systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product and program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices across the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 3 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising roughly 10 sub-teams that independently drive their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to build systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product and program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices across the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 3 days ago

Apply

5.0 - 8.0 years

17 - 25 Lacs

Bhubaneswar, Dubai, Coimbatore

Work from Office


Please note: only candidates willing to travel to the Middle East should apply.

Role & responsibilities
1. Data Quality & Governance: Implement data quality frameworks, policies, and standards to ensure data accuracy, completeness, and consistency across enterprise systems.
2. Master Data Management (MDM) Implementation: Design and configure MDM solutions using Informatica MDM (On-Prem & Cloud) for key business domains (Customer, Product, Vendor, etc.).
3. Data Profiling & Cleansing: Leverage Informatica Data Quality for data profiling, cleansing, standardization, deduplication, and enrichment to improve data reliability.
4. Metadata Management & Data Lineage: Deploy and maintain Informatica Metadata Manager to enhance data discoverability, governance, and lineage tracking.
5. Integration & Interoperability: Ensure seamless integration of MDM and DQ solutions with core enterprise applications (ERP, CRM, BI tools), supporting ETL/ELT teams.
6. Stakeholder Collaboration: Act as a liaison between business and IT teams, translating business requirements into scalable MDM and DQ solutions.
7. Training & Support: Provide guidance, training, and best practices to data stewards and business users to drive a culture of data governance.

Preferred candidate profile

Education & Experience
- Bachelor's/Master's degree in Computer Science, Data Management, Information Systems, or a related field.
- 5+ years of consulting experience in Data Quality, MDM, and Metadata Management.
- Expertise in Informatica IDQ (or IDMC Cloud Data Quality) and Informatica MDM (On-Prem & Cloud).

Technical Skills
- Strong experience in data profiling, cleansing, standardization, and deduplication.
- Hands-on knowledge of data governance frameworks, data quality rules, and stewardship best practices.
- Expertise in SQL, data modeling, and data architecture principles.
- Experience integrating MDM and DQ solutions with enterprise applications (SAP, Salesforce, Microsoft Dynamics, etc.).
- Familiarity with cloud platforms (MS Azure), with a focus on cloud-based data governance and integration.
- Experience designing end-to-end DQ and MDM solutions.

Preferred Industry Experience
Prior experience in DQ/MDM implementation within at least one of the following sectors: Oil & Gas, Financial Services, Manufacturing, Healthcare, Real Estate, Tourism, Government/Citizen Services, Mobility, Energy & Utilities, Telecom.

Consulting & Leadership Skills
- Strong stakeholder management and client engagement skills, with experience working on DQ & MDM consulting projects.
- Pre-sales experience, with the ability to build quick PoCs and client demos and to support business development efforts.
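Informatica performs profiling, cleansing, and deduplication inside its own tooling, but the underlying steps can be sketched in a few lines of plain Python. This is a conceptual illustration only, not Informatica's API; every record and field name below is invented:

```python
# Conceptual sketch of DQ/MDM steps: standardize, profile, deduplicate.
records = [
    {"id": 1, "name": " Acme Corp ", "email": "ops@acme.example"},
    {"id": 2, "name": "ACME CORP",   "email": "ops@acme.example"},
    {"id": 3, "name": "Globex",      "email": None},
]

def standardize(rec):
    # Cleansing/standardization: trim and collapse whitespace, normalize case.
    out = dict(rec)
    if out["name"]:
        out["name"] = " ".join(out["name"].split()).upper()
    return out

std = [standardize(r) for r in records]

# Profiling: completeness per field (share of non-null values).
completeness = {
    f: sum(r[f] is not None for r in std) / len(std)
    for f in ("id", "name", "email")
}

# Deduplication: collapse records sharing a match key (name + email).
survivors = {}
for r in std:
    key = (r["name"], r["email"])
    survivors.setdefault(key, r)  # first record wins (a simple survivorship rule)

print(completeness["email"])  # 2 of 3 records carry an email
print(len(survivors))         # 2 golden records remain
```

Real MDM tools replace the exact-match key with fuzzy matching and configurable survivorship rules, but the profile-then-match-then-merge flow is the same.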

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Mohali

Work from Office


We are looking for a Snowflake Developer with 5+ years of experience in Snowflake Data Warehouse and related tools. You will build, manage, and optimize data pipelines, assist in data integration, and contribute to data architecture. The ideal candidate should understand data modeling and ETL processes and have experience with cloud-based data platforms.

Key Responsibilities
- Design, develop, and maintain Snowflake data warehouses.
- Create and manage Snowflake schemas, tables, views, and materialized views.
- Implement ETL processes to integrate data from various sources into Snowflake.
- Collaborate with data engineers, data scientists, and analysts to build efficient data pipelines.
- Ensure data integrity, security, and compliance with data governance policies.

Requirements
- Proficiency in SQL, SnowSQL, and ETL processes.
- Strong experience in data modeling and schema design in Snowflake.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data pipelines, data lakes, and data integration tools.
- Experience with orchestration tools such as dbt or Airflow is a plus.

Work with us
SourceMash Technologies has been a leading solution provider for internet-based applications and product development since 2008. Be a part of a company of highly skilled professionals dedicated to providing total IT solutions under one roof. We offer remarkable services in the areas of software development, quality assurance, and support. An employee welcome kit (custom notepad, T-shirt, water bottle, etc.) is included in the onboarding package. SourceMash Technologies offers employee health insurance that covers employees' family members under the same policy. Annual leave is paid at the rate applicable in the working period before the leave, and untaken leave cannot be counted toward mandatory notice periods.

Posted 3 days ago

Apply

12.0 - 15.0 years

13 - 18 Lacs

Mumbai

Work from Office


Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Data & AI Strategy
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration. You will collaborate with various teams to ensure that the data architecture aligns with business objectives and supports the overall data strategy. You will also engage in discussions to refine data models and address any challenges that arise during the development process, ensuring that the data architecture is robust and scalable enough to meet future needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Mentor junior team members to enhance their skills and understanding of data architecture.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data & AI Strategy.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud data services and architectures.
- Ability to design and implement data governance frameworks.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data & AI Strategy.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 days ago

Apply

15.0 - 20.0 years

13 - 18 Lacs

Gurugram

Work from Office


Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Data & AI Strategy
Good-to-have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business objectives and supports efficient data management and accessibility. You will collaborate with various teams to understand their data needs and provide innovative solutions that enhance data utilization across the organization.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Continuously evaluate and improve data architecture practices to ensure scalability and performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data & AI Strategy.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Ability to design and implement data governance frameworks.
- Familiarity with cloud-based data storage solutions and architectures.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Data & AI Strategy.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Master Data Management (MDM)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the team in implementing efficient Master Data Management solutions.
- Conduct regular reviews and assessments to ensure data accuracy and integrity.
- Stay updated on industry trends and best practices in MDM.

Professional & Technical Skills:
- Must-have skills: Proficiency in Master Data Management (MDM).
- Strong understanding of data governance principles.
- Experience in data modeling and data quality management.
- Knowledge of data integration tools and techniques.
- Good-to-have skills: Experience with data governance frameworks.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Master Data Management (MDM).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 days ago

Apply

10.0 - 12.0 years

37 - 40 Lacs

Pune

Work from Office


JR: R00208204
Experience: 10-12 years
Educational Qualification: Any degree

Job Title: S&C - Data and AI - CFO&EV Quantexa Platform (Associate Manager)
Management Level: 8 - Associate Manager
Location: Pune, PDC2C
Must-have skills: Quantexa Platform
Good-to-have skills: Experience in financial modeling, valuation techniques, and deal structuring.

Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.

Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.

WHAT'S IN IT FOR YOU
The Accenture CFO & EV team, under the Data & AI team, has a comprehensive suite of capabilities in risk, fraud, financial crime, and finance. Within the risk realm, our focus revolves around model development, model validation, and auditing of models. Additionally, our work extends to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models. You will get to work with top financial clients globally and access resources enabling you to utilize cutting-edge technologies, fostering innovation with the world's most recognizable companies. Accenture will continually invest in your learning and growth and will support you in expanding your knowledge. You'll be part of a diverse and vibrant team, collaborating with talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation.

What you would do in this role

Engagement Execution
- Lead client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of fraud analytics/management solutions for Accenture's clients.
- Advise clients on a wide range of fraud management/analytics initiatives; projects may involve fraud management advisory work for CXOs to achieve a variety of business and operational outcomes.
- Develop and frame proofs of concept for key clients, where applicable.

Practice Enablement
- Mentor, groom, and counsel analysts and consultants.
- Support development of the practice by driving innovations and initiatives.
- Develop thought capital and disseminate information around current and emerging trends in fraud analytics and management.
- Support the sales team's efforts to identify and win potential opportunities by assisting with RFPs and RFIs; assist in designing POVs and GTM collateral.

Travel: Willingness to travel up to 40% of the time.

Professional Development Skills: Project dependent.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.
- Advanced skills in the development and validation of fraud analytics models, strategies, and visualizations.
- Understanding of new and evolving methodologies, tools, and technologies in the fraud management space.
- Expertise in one or more domains/industries, including regulations, frameworks, etc.
- Experience in building models using AI/ML methodologies.
- Modeling: experience in one or more analytical tools such as SAS, R, Python, SQL, etc.
- Knowledge of data processes, ETL, and tools/vendor products such as VISA AA, FICO Falcon, EWS, RSA, IBM Trusteer, SAS AML, Quantexa, Ripjar, Actimize, etc.
- Proven experience in a data engineering, data governance, or data science role.
- Experience in Generative AI or central/supervisory banking is a plus.
- Strong conceptual knowledge and practical experience in the development, validation, and deployment of ML/AI models.
- Hands-on programming experience with analytics and visualization tools (Python, R, PySpark, SAS, SQL, Power BI/Tableau).
- Knowledge of big data, MLOps, and cloud platforms (Azure/GCP/AWS).
- Strong written and oral communication skills.
- Project management skills and the ability to manage multiple tasks concurrently.
- Strong delivery experience on short- and long-term analytics projects.

Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture

Qualification
Experience: 10-12 years
Educational Qualification: Any degree
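Model validation work of the kind this role describes typically starts from basic performance metrics on held-out outcomes. As a minimal, hedged sketch (pure Python with synthetic scores and labels; no vendor API is involved), precision and recall of a thresholded fraud score can be computed like this:

```python
# Synthetic example: validate a fraud-scoring model against known outcomes.
# All scores and labels below are made up for illustration.
scores = [0.91, 0.15, 0.78, 0.05, 0.66, 0.40]  # model fraud scores
labels = [1,    0,    1,    0,    0,    1]     # 1 = confirmed fraud

threshold = 0.6
flags = [int(s >= threshold) for s in scores]  # model's fraud flags

# Confusion-matrix counts for the flagged class.
tp = sum(f and y for f, y in zip(flags, labels))          # flagged, was fraud
fp = sum(f and not y for f, y in zip(flags, labels))      # flagged, was not
fn = sum((not f) and y for f, y in zip(flags, labels))    # missed fraud

precision = tp / (tp + fp)  # of the flags raised, how many were right
recall = tp / (tp + fn)     # of the frauds present, how many were caught

print(round(precision, 2), round(recall, 2))
```

Sweeping `threshold` trades precision against recall, which is exactly the strategy-tuning exercise the posting's "development and validation of fraud analytics models" points at.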

Posted 3 days ago


7.0 - 12.0 years

5 - 9 Lacs

Noida

Work from Office


Project Role : Data Modeler
Project Role Description : Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must have skills : Data Modeling Techniques and Methodologies
Good to have skills : NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary : As a Data Modeler, you will collaborate with key stakeholders, including business representatives, data owners, and architects, to model current and new data, contributing to data architecture decisions and solutions.

Roles & Responsibilities :
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead data modeling efforts for various projects.
- Develop and maintain data models for databases.
- Ensure data integrity and quality in all data modeling activities.
- Conduct data analysis to support business requirements.

Professional & Technical Skills :
- Must Have Skills : Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles.
- Experience with data modeling tools such as ERwin or PowerDesigner.
- Knowledge of data governance and data quality best practices.
- Good To Have Skills : Experience with data warehousing concepts.

Additional Information :
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Noida office.
- A 15 years full time education is required.

Posted 3 days ago


15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office


Project Role : Application Lead
Project Role Description : Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills : SAP Master Data Governance (MDG) Tool
Good to have skills : NA
Minimum 12 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members.

Roles & Responsibilities :
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the application development process effectively.
- Ensure seamless communication among team members.
- Provide guidance and support to team members.

Professional & Technical Skills :
- Must Have Skills : Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles.
- Experience in configuring and customizing the SAP MDG Tool.
- Knowledge of SAP data models and structures.
- Hands-on experience in data migration and data quality management.

Additional Information :
- The candidate should have a minimum of 12 years of experience in the SAP Master Data Governance (MDG) Tool.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 3 days ago


15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role : Application Lead
Project Role Description : Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills : SAP Master Data Governance (MDG) Tool, HTML 5, CSS 3, JavaScript
Good to have skills : RESTful and SOAP-based web services
Minimum 7.5 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business needs and technical specifications. Your role will require effective communication and coordination with stakeholders to drive project success and deliver high-quality solutions.

Roles & Responsibilities :
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills :
- Must Have Skills : Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and practices.
- Experience with application design and configuration.
- Ability to lead cross-functional teams and manage stakeholder expectations.
- Familiarity with project management methodologies.

Additional Information :
- The candidate should have a minimum of 7.5 years of experience in the SAP Master Data Governance (MDG) Tool.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 3 days ago


5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 5 year(s) of experience is required.
Educational Qualification : 15 years full time education

Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities :
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the implementation of data platform solutions.
- Conduct regular data platform performance assessments.
- Identify and address data platform security vulnerabilities.
- Stay updated on emerging data platform technologies.

Professional & Technical Skills :
- Must Have Skills : Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience in designing and implementing data pipelines.
- Knowledge of data governance and compliance standards.
- Experience with data modeling and database design.

Additional Information :
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 days ago

