
2484 Data Quality Jobs - Page 22

JobPe aggregates listings for easy access; you apply directly on the original job portal.

2.0 - 5.0 years

3 - 8 Lacs

Hyderabad

Work from Office


We have a job requirement for Ataccama Production Support: data quality, governance, metadata management, and the Ataccama ONE platform. ETL Developers or SQL Developers are also applicable. A minimum of 2+ years of experience is required. Interview mode: face-to-face.

Posted 1 week ago

Apply

9.0 - 14.0 years

10 - 20 Lacs

Hyderabad

Work from Office


Responsibilities (how we will measure success)
• Working within Business Architecture & Solution Design for our critical and proprietary global workflow platform, reporting into the Product Manager
• You will be responsible for the ongoing project management of large-scale technical rollouts or deployments, e.g. new global upgrades
• You will be an excellent communicator and work with many international stakeholders to ensure internal and client-facing teams are aware of implementation progress, with strong written communication skills to document progress, decisions, and change requests
• You will manage sprint planning and hold the technical team to account
• You will oversee testing of deployments and BAU bug fixes and requests
• You will manage retrospectives
• You will drive the communications plans for the platform
• You will engage with the business on the product, leveraging support from the product manager
• You will lead inputs for steering committee / product committee meetings in relation to ongoing projects and technical developments
• You will have a hands-on approach, with knowledge of the platform and the business to enable standalone discussions with the business on detailed business requirements
• You will work with remote and international colleagues to investigate problem statements and new opportunities, seeking effective business solutions through improvements in either business processes or the platform
• You will help drive communications around the platform, ensuring stakeholders are aware of success stories and the benefits being driven internally and with clients

Tasks (what does the role do on a day-to-day basis)
• Take overall responsibility for managing the project implementations, working with many international stakeholders to drive progress
• Project-manage deployments in Agile sprints, helping the delivery manager and product manager understand the business urgency or priorities of requests for managing the product backlog
• Lead inputs and presentations for the preparation of the Product Committee and Steer Co meetings, informing and consulting international stakeholders on plans and questions for decision
• Lead training calendars and maintenance of related training materials for the wider business, engaging with teams to ensure ongoing enhancements as needed
• Work closely with the wider Operations & Technology teams based across 14+ countries to manage and oversee projects
• Facilitate stakeholder meetings and workshops, and present findings and actions both verbally and in writing to the business
• Help drive the platform embedding, ensuring data quality and maintenance is at the forefront of our stakeholders' minds and all the relevant reports are being utilised
• Support progression, development, and mentoring of more junior team members internationally
• Support discussions with other global platform teams across departments on alignment, integrations, and best practice
• Consider opportunities and potential risks attached to suggestions you make

Posted 1 week ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office


The FICC Quant Developer should possess a robust understanding of Fixed Income, particularly in pricing calculations. The ideal candidate should be proficient in Python or other relevant programming languages and have experience in developing fixed income pricing models or calculation engines.

Key Responsibilities
• Developing Fixed Income pricing/valuation and analytics models using statistical techniques
• Interacting with the trading and client senior technology teams to analyze and understand their requirements
• Back-testing models
• Implementing models on client calculation platforms; understanding data quality nuances and designing rules around them
• Handling ad hoc requests for data analysis or building peripheral models

Experience
• Postgraduate in Economics/Financial Engineering/Maths/Physics, with at least 8+ years of work experience
• Proficient in econometrics, with prior experience in quantitative modeling, specifically in Fixed Income
• Good understanding of Fixed Income as an asset class, pricing and valuation, and other analytics
• Coding skills: Python/C#/C++ etc. (advanced level)
• Good knowledge of databases, preferably third-party providers like Bloomberg
• Advanced Excel/VBA and quantitative skills: ability to work with and analyse large sets of data and information
• Excellent communication and interpersonal skills
• High level of independent thinking and approach

Behavioral Competencies
• Good communication (verbal and written)
• Experience in managing client stakeholders
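The pricing work this role describes ultimately reduces to discounting cash flows. As a purely illustrative sketch (not this employer's actual model or any specific production engine), a fixed-coupon bond priced in plain Python:

```python
# Illustrative only: price a fixed-coupon bond as the sum of its
# discounted cash flows. Parameter names are generic assumptions,
# not taken from the posting.

def bond_price(face, coupon_rate, ytm, years, freq=2):
    """Present value of a fixed-coupon bond.

    face        -- face (par) value repaid at maturity
    coupon_rate -- annual coupon rate (e.g. 0.05 for 5%)
    ytm         -- annual yield to maturity used for discounting
    years       -- whole years to maturity
    freq        -- coupon payments per year (2 = semi-annual)
    """
    periods = years * freq
    c = face * coupon_rate / freq      # coupon paid each period
    r = ytm / freq                     # per-period discount rate
    pv_coupons = sum(c / (1 + r) ** t for t in range(1, periods + 1))
    pv_face = face / (1 + r) ** periods
    return pv_coupons + pv_face
```

A quick sanity check of the formula: when the coupon rate equals the yield, the bond prices at par, and raising the yield lowers the price.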

Posted 1 week ago

Apply

3.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


As a member of the Data and Technology practice, you will be working on advanced AI/ML engagements tailored for the investment banking sector. This includes developing and maintaining data pipelines, ensuring data quality, and enabling data-driven insights. Your core responsibility will be to build and manage scalable data infrastructure that supports our proof-of-concept initiatives (POCs) and full-scale solutions for our clients. You will work closely with data scientists, DevOps engineers, and clients to understand their data requirements, translate them into technical tasks, and develop robust data solutions.

Your primary duties will encompass:
• Develop, optimize, and maintain scalable and reliable data pipelines using tools such as Python, SQL, and Spark
• Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure, Snowflake, and Databricks
• Implement data quality checks and ensure the accuracy and consistency of data
• Manage and optimize data storage solutions, ensuring high performance and availability
• Work closely with data scientists and DevOps engineers to ensure seamless integration of data pipelines and support machine learning model deployment
• Monitor and optimize the performance of data workflows to handle large volumes of data efficiently
• Create detailed documentation of data processes
• Implement security best practices and ensure compliance with industry standards

Experience / Skills
5+ years of relevant experience, including:
• A data engineering role, preferably within the financial services industry
• Strong experience with data pipeline tools and frameworks such as Python, SQL, and Spark
• Proficiency in cloud platforms, particularly Azure, Snowflake, and Databricks
• Experience with data integration from various sources including APIs and databases
• Strong understanding of data warehousing concepts and practices
• Excellent problem-solving skills and attention to detail
• Strong communication skills, both written and oral, with business and technical aptitude

Desired skills:
• Familiarity with big data technologies and frameworks
• Experience with financial datasets and understanding of investment banking metrics
• Knowledge of visualization tools (e.g., Power BI)

Education
Bachelor's or Master's in Science or Engineering disciplines such as Computer Science, Engineering, Mathematics, Physics, etc.
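The "data quality checks" duty listed above can be pictured with a minimal sketch. The field names (`trade_id`, `price`) and the specific rules are invented for illustration and are not taken from the posting:

```python
# Minimal, illustrative data quality checks over a batch of row dicts:
# null check, uniqueness check, and a simple range rule.

def run_quality_checks(rows):
    """Return a list of (row_index, reason) for every failed check."""
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Null and uniqueness checks on the hypothetical key column.
        if row.get("trade_id") is None:
            failures.append((i, "trade_id is null"))
        elif row["trade_id"] in seen_ids:
            failures.append((i, "duplicate trade_id"))
        else:
            seen_ids.add(row["trade_id"])
        # Range rule: prices must be present and positive.
        price = row.get("price")
        if price is None or price <= 0:
            failures.append((i, "price missing or non-positive"))
    return failures
```

In a real pipeline checks like these would typically run as a validation stage before loading, with failures routed to a quarantine table rather than silently dropped.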

Posted 1 week ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Partner with the data strategy and sourcing team on data requirements to migrate scripts from MATLAB to Python, and work on re-creating data visualizations using Tableau/Power BI.

Desired Skills and Experience
Essential skills:
• 4-6 years of experience with data analytics
• Skilled in Python, PySpark, and MATLAB
• Working knowledge of Snowflake and SQL
• Hands-on experience generating dashboards using Tableau/Power BI
• Experience working with financial and/or alternative data products
• Excellent analytical and strong problem-solving skills
• Working knowledge of data science concepts, regression, statistics, and the associated Python libraries
• Interest in quantitative equity investing and data analysis
• Familiarity with version control systems such as Git

Education: B.E./B.Tech in Computer Science or a related field

Key Responsibilities
• Re-write and enhance the existing analytics process and code from MATLAB to Python
• Build a GUI to allow users to provide parameters for generating these reports
• Store the data in Snowflake tables and write queries using PySpark to extract, manipulate, and upload data as needed
• Re-create the existing dashboards in Tableau and Power BI
• Collaborate with the firm's research and IT teams to ensure data quality and security
• Engage with technical and non-technical clients as SME on data asset offerings

Key Metrics: Python, SQL, MATLAB, Snowflake, Pandas/PySpark, Tableau, Power BI, Data Science

Behavioral Competencies
• Good communication (verbal and written)
• Experience in managing client stakeholders
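The MATLAB-to-Python migration described above typically means re-expressing numeric routines in Python. A hypothetical example, mirroring the common MATLAB `movmean`-style trailing window idiom rather than any actual script from this engagement:

```python
# Hypothetical port of a small MATLAB-style routine to plain Python:
# a trailing moving average that emits None until the window is full.
# The original scripts are not shown in the posting; this is a generic
# illustration of the kind of rewrite involved.

def moving_average(series, window):
    """Trailing moving average over a list of numbers."""
    out = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)  # not enough history yet
        else:
            chunk = series[i + 1 - window : i + 1]
            out.append(sum(chunk) / window)
    return out
```

In practice such a rewrite would lean on NumPy or pandas for vectorised performance; the plain-Python form above just makes the logic explicit.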

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 6 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience
Essential skills:
• B.Tech/M.Tech/MCA with 2-4 years of overall experience
• Skilled in Python and SQL
• Experience with data modeling, data warehousing, and building data pipelines
• Experience working with FTP, API, S3, and other distribution channels to source data
• Experience working with financial and/or alternative data products
• Experience working with cloud-native tools for data processing and distribution
• Experience with Snowflake and Airflow

Key Responsibilities
• Engage with vendors and technical teams to systematically ingest, evaluate, and create valuable data assets
• Collaborate with the core engineering team to create central capabilities to process, manage, and distribute data assets at scale
• Apply robust data quality rules to systematically qualify data deliveries and guarantee the integrity of financial datasets
• Engage with technical and non-technical clients as SME on data asset offerings

Key Metrics: Python, SQL, Snowflake, data engineering and pipelines

Behavioral Competencies
• Good communication (verbal and written)
• Experience in managing client stakeholders
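The "data quality rules to qualify data deliveries" responsibility above can be sketched as a small rule-driven qualifier. The rules and field names here are assumptions for illustration only, not the fund's actual checks:

```python
# Illustrative rule-driven qualification of a data delivery: each rule
# is a (name, predicate) pair applied per record; the delivery passes
# only if every record satisfies every rule.

RULES = [
    ("isin_present", lambda r: bool(r.get("isin"))),
    ("date_iso",     lambda r: len(str(r.get("as_of_date", ""))) == 10),
    ("qty_numeric",  lambda r: isinstance(r.get("quantity"), (int, float))),
]

def qualify_delivery(records):
    """Return (passed, violations) for a batch of record dicts."""
    violations = [
        (i, name)
        for i, rec in enumerate(records)
        for name, check in RULES
        if not check(rec)
    ]
    return (not violations, violations)
```

Keeping rules as data rather than inline `if` statements makes it easy to add vendor-specific checks and to report exactly which rule each record violated.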

Posted 1 week ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Job Purpose
Evaluate the data governance framework and Power BI environment. Provide recommendations for enhancing data quality and discoverability, and optimize Power BI performance.

Desired Skills and Experience
• B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
• 7+ years of experience as a Data and Cloud Architect working with client stakeholders
• Excellent communication skills, both written and verbal
• Extremely strong organizational and analytical skills with strong attention to detail
• Strong track record of excellent results delivered to internal and external clients
• Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
• Experience delivering projects within an agile environment
• Experience in project management and team management

Key responsibilities include:
• Understand and review PowerShell (PS), SSIS, Batch Scripts, and C# (.NET 3.0) codebases for data processes
• Assess the complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB)
• Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial
• Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones)
• Understand requirements for external tables (Lakehouse)
• Evaluate and ensure quality of deliverables within project timelines
• Develop a strong understanding of equity market domain knowledge
• Collaborate with domain experts and business stakeholders to understand business rules/logic
• Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
• Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments
• Take responsibility for end-to-end delivery of projects, coordination between client and internal offshore teams, and managing client queries
• Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
• Carry out quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT)

Posted 1 week ago

Apply

2.0 - 6.0 years

1 - 2 Lacs

Nagpur

Work from Office


We are looking for a Standard Setter to join our team and ensure that our clients' data is accurate, complete, and meets industry standards. The ideal candidate should be a graduate with strong attention to detail, excellent analytical skills, and a proven track record of identifying and resolving data quality issues.

Job responsibilities
• Collaborating with clients to understand their data requirements
• Developing quality standards and guidelines for data collection and analysis
• Implementing quality control procedures to ensure that data is accurate, complete, and meets industry standards
• Identifying and resolving data quality issues, including data cleansing, transformation, and validation
• Conducting regular data quality audits and reporting on findings to clients
• Providing guidance and support to clients on data quality issues
• Maintaining accurate records of data quality checks and reporting

Qualifications
• Bachelor's degree or higher in any field
• Strong attention to detail and analytical skills
• Experience with data analysis and quality control
• Familiarity with data management and reporting tools
• Excellent written and verbal communication skills
• Ability to work independently and as part of a team
• Strong problem-solving and critical thinking skills

Posted 1 week ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM or Informatica Axon
Educational Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education

Summary: As an Application Lead for Informatica MDM, you will be responsible for leading the development and deployment of Informatica MDM solutions. Your typical day will involve working with cross-functional teams, designing and implementing MDM solutions, and ensuring data quality and integrity.

Roles & Responsibilities:
- Lead the design and implementation of Informatica MDM solutions, ensuring data quality and integrity.
- Collaborate with cross-functional teams to understand business requirements and design MDM solutions that meet those requirements.
- Develop and maintain MDM workflows, mappings, and data models.
- Provide technical leadership and guidance to team members, ensuring adherence to best practices and standards.
- Stay updated with the latest advancements in Informatica MDM and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: Strong experience in Informatica MDM.
- Good-to-have skills: Experience with Informatica PowerCenter, Oracle, and SQL Server.
- Solid understanding of data modeling and data integration concepts.
- Experience with MDM workflows, mappings, and data models.
- Strong understanding of data quality and integrity concepts.
- Experience with Agile development methodologies.

Posted 1 week ago

Apply

10.0 - 15.0 years

30 - 37 Lacs

Bengaluru

Hybrid


Roles and Responsibilities:
• Take ownership of the team's deliverables in the areas of data cataloguing, metadata enrichment, data quality management, master data management, data control, and auditing
• Monitor the team's deliverables very closely and handhold as required (team members' experience ranges from 0 to 15 years)
• Lead the daily standup meetings and required follow-ups with different stakeholders
• Report status to superiors

Must Have:
• 12-17 years of experience in data-related projects
• 6+ years of deep hands-on experience in Data Governance, Data Quality, Metadata Management, and Master Data Management
• Well-rounded skills and experience in design, development, and project management (required to understand the practical problems faced by team members at different levels)
• Agile delivery experience
• Solid conceptual knowledge, implementation experience, and hands-on tool experience related to data governance, data quality, data cataloguing, metadata enrichment, metadata management, and reference data management
• Good verbal and written communication skills
• Go-getter, aggressive in execution, and able to work with very little direction
• Current with the latest trends in cloud data engineering and AI/ML (at least at a conceptual level)
• Hands-on experience in SQL, Python, DevOps, ETL, and cloud data engineering is a big plus

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office


The Team: The Private Markets Data Operations Team provides exceptional insights and analytics that empower clients in the private markets landscape. The team's purpose is to generate unique, market-leading data, ensuring accuracy, completeness, and timeliness. By leveraging innovative data collection and analytical techniques, we enhance transparency and deliver actionable insights that inform strategic decision-making. Our culture of innovation prioritizes Lean methodologies and automation to streamline processes and improve data integrity. Through strong stakeholder relationships and a deep understanding of market dynamics, we position ourselves as trusted partners, equipping clients with the intelligence needed to navigate opportunities and risks in a competitive environment.

Responsibilities & Impact: In this role, you will significantly contribute to the data team's objectives by supporting the collection, analysis, and maintenance of datasets. Your work will directly influence the accuracy, completeness, and timeliness of the data provided, driving strategic decision-making. You will collaborate with team members to execute data quality initiatives and lead ad-hoc projects aimed at enhancing our data offerings. This role provides an opportunity to further develop your analytical and leadership skills while working with motivated individuals to generate actionable insights that support operational excellence.

Responsibilities:
• Lead Data Collection Efforts: Gather and verify extensive data on market participants, utilizing publicly available sources including, but not limited to, websites, news articles, and various public registries and filings to provide accurate insights and ensure thorough data coverage.
• Implement Data Quality Standards: Oversee and implement rigorous data cleansing processes to ensure high accuracy and consistency in datasets, utilizing best practices and methodologies.
• Conduct Advanced Reporting and Analysis: Perform in-depth reporting and trend analysis, generating comprehensive reports that provide actionable insights for strategic decision-making.
• Drive Automation and Efficiency: Spearhead automation initiatives for data collection and reporting tasks using SQL and Lean methodologies to optimize processes and enhance overall efficiency.
• Utilize Advanced Analytical Tools: Apply advanced analytical tools, including GenAI, for exploratory data analysis to extract deeper insights and facilitate informed decision-making.
• Ensure Compliance and Documentation: Maintain thorough documentation of data collection processes and compliance, contributing to continuous improvement initiatives and best practices.
• Achieve Performance Targets: Consistently deliver on individual and team targets with a strong emphasis on quality assurance and operational excellence.
• Contribute to Strategic Enhancements: Actively contribute to the development of new data collection methods and product enhancements to improve strategies based on market needs.
• Troubleshoot Complex Issues: Address and resolve complex data-related issues, provide support to team members, and foster a collaborative problem-solving environment.
• Lead Workflow Improvement Initiatives: Drive initiatives aimed at refining workflows and enhancing overall team performance through innovative process improvements.

Preferred Qualifications / What We Are Looking For:
• Master's degree in finance, economics, data science, or related fields
• 1-2+ years of experience in data projects; familiarity with validation and cleansing techniques preferred
• Strong analytical mindset with attention to detail and advanced quantitative skills
• Proficiency in SQL and Excel; familiarity with BI tools (Tableau/Power BI) is essential
• In-depth understanding of compliance in data collection and reporting
• Exposure to Lean principles or automation tools is essential
• Willingness to learn and adapt to modern technologies, including GenAI
• Excellent communication skills with the ability to articulate complex data insights effectively
• Strong time-management and multi-tasking skills to handle various business facets
• Proficiency in secondary research and online thematic research
• Certification and experience in MS Office (Excel, Word, PowerPoint)
• Initiative and resourcefulness in problem-solving
• Adaptability to flexible shifts and team environments
• Strong project management skills for ad-hoc projects
• Deep interest in market trends with the ability to analyze market dynamics
• Strong collaboration and interpersonal skills to build relationships with stakeholders
• Proactive in enhancing technical skills relevant to data analysis and reporting
• Ability to share constructive feedback and foster a culture of continuous improvement

Posted 1 week ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Primary Responsibilities:
Reporting Development and Data Integration:
• Assist with data projects related to integration with our core claims adjudication engines, eligibility, and other database items as necessary
• Support the data leads by producing ad hoc reports as needed, based on requirements from the business
• Report on key milestones to our project leads
• Ensure all reporting aligns with brand standards
• Ensure PADU guidelines for tools, connections, and data security
• Build a network with internal partners to assist with validating data quality

Analytical Skills Utilization:
• Apply analytical skills and develop business knowledge to support operations
• Identify automation opportunities through trends and day-to-day tasks to help create efficiencies within the team
• Perform root cause analysis via the 5-whys method to identify process gaps and initiate process improvement efforts
• Assist with user testing for reports and business insights dashboards, and assist with automation validation review
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
• Degree or equivalent data science, analysis, or mathematics experience
• Experience supporting operational teams' performance with reports and analytics
• Experience using Word (creating templates/documents), PowerPoint (creation and presentation), Teams, and SharePoint (document access/storage, sharing, List development and management)
• Basic understanding of reporting using business insights tools including Tableau and Power BI
• Expertise in Excel (data entry, sorting/filtering) and VBA
• Proven solid communication skills, including oral, written, and organizational skills
• Proven ability to manage emotions effectively in high-pressure situations, maintaining composure and fostering a positive work environment conducive to collaboration and productivity

Preferred Qualifications:
• Experience leveraging and creating automation such as macros, Power Automate, or Alteryx/ETL applications
• Experience working with cloud-based servers; knowledge of database structure and stored procedures
• Experience performing root cause analysis and demonstrated problem-solving skills
• Knowledge of R/Python, SQL, DAX, or other coding languages
• Knowledge of multiple lines of business, benefit structures, and claims processing systems

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 1 week ago

Apply

12.0 - 15.0 years

45 - 55 Lacs

Bengaluru

Work from Office


Join us as a Solution Designer
• Take on a varied role, where you'll own the end-to-end high-level business design for a project, programme, or initiative
• You'll be working with a range of stakeholders to identify investment priorities, define opportunities, and shape journeys to meet our strategic goals
• This is a chance to shape the future of our business and gain great exposure across the bank in the process
• We're offering this role at vice president level

What you'll do
As a Solution Designer, you'll engage with relevant stakeholders as a single point of contact for design aspects. You'll be representing the design function at governance forums and working with enterprise architects to make sure standards and principles are adhered to. You'll also analyse requirements into coherent end-to-end designs, taking the business architecture into account. Other duties include:
• Translating requirements into a series of transition-state designs and an executable roadmap
• Partnering with technology and data teams to develop a data product roadmap supporting customer and reference data outcomes
• Documenting the relevant design in accordance with standard methods
• Designing systems and processes supporting data quality issue management across customer and reference data, optimising for data quality remediation where possible

The skills you'll need
You'll already have a background in solution design and a minimum of ten years' experience using industry-standard models and tools. Alongside good communication skills, you'll also need the ability to lead and collaborate with both internal and external teams. We'll also want to see:
• Knowledge of cloud data practices and data architecture
• A broad understanding of data lakehouse solutions, such as SageMaker, in implementing effective data management practices
• Creative skills to design solutions supporting the bank-wide simplification programme for customer and reference data

Hours: 45
Job Posting Closing Date: 01/07/2025

Posted 1 week ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office


We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key Responsibilities:
• Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks Notebooks, Jobs, and Workflows
• Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability
• Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data
• Implement distributed data processing solutions using Apache Spark/PySpark for large-scale data transformation
• Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured
• Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem
• Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure
• Conduct code reviews, define coding standards, and promote engineering excellence across the team
• Mentor and guide junior data engineers, fostering a culture of technical growth and innovation

Requirements:
• 8+ years of experience in data engineering with proven leadership in managing data projects and teams
• Expertise in Python, SQL, Spark (PySpark),
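The ETL/ELT bullet above, handling semi-structured data, might look like this minimal extract-and-flatten sketch. It is illustrative only, using the standard library rather than the employer's Spark/Databricks stack, and the field names are invented:

```python
# Illustrative ETL step: parse newline-delimited JSON (a common
# semi-structured format) and flatten a nested object into tabular
# columns, as a pipeline might do before loading into a warehouse.

import json

def extract(raw_lines):
    """Parse NDJSON lines into dicts, skipping unparseable lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter store
    return records

def transform(records):
    """Flatten a nested 'meta' object into prefixed top-level columns."""
    rows = []
    for rec in records:
        meta = rec.pop("meta", {})
        rec.update({f"meta_{k}": v for k, v in meta.items()})
        rows.append(rec)
    return rows
```

On Spark the same shape would be expressed with DataFrame operations so it scales across a cluster; the logic per record is the same.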

Posted 1 week ago

Apply

2.0 - 4.0 years

50 - 60 Lacs

Mumbai

Work from Office


ISS STOXX is actively hiring for Data Quality Analyst to join our QBIT Team (Quality Assurance Benchmarking and Independent Testing) in Mumbai ( Goregaon East), India. ISS Governance Governance offerings include objective governance research and recommendations, and end-to-end proxy voting and distribution solutions. Institutional clients have long turned to ISS to apply their corporate governance views, identify environmental, social, and governance risk, and manage their complete proxy voting needs on a global basis. ISS covers approximately 44,000 meetings in 115 countries yearly, delivering proxy research and vote recommendations while working closely with clients to execute more than 10.2 million ballots representing 4.2 trillion shares. To learn more, visit https: / / www.issgovernance.com / solutions Overview: The QBIT Analytical Verification team independently verifies if various models, data solutions, data processes, and business logic related to corporate governance and responsible investing are implemented accurately. The analytical verification encompasses independently developing prototypes of models, data processes, and business logic, devising test cases and vectors, production code review, and documentation of the test results. An Analyst is expected to assist the team in all the stages of the analytical verification cycle. The candidate will be responsible for designing, planning, executing, and supporting the automated verification as well as software deployment and release management. This is a techno-functional role that blends the domain of Environment, Social and Corporate Governance (ESG) risk with data technology and analysis. You will primarily work on technologies such as MS SQL Server, Python and Power BI. The role does not involve statistical or quantitative modeling or developing machine learning models. 
Responsibilities: Develop prototypes of models, data processes, and business logic related to corporate governance and responsible investing products and data platforms Develop approach and framework to verify the models, data processes, and business logic underlying various analytical products and data solutions Data profiling and data quality analysis Backed by data analysis, assess and evaluate model specifications, data flow and processes and business logic for accuracy, completeness, thoroughness, and potential dislocations Assess the scope of different projects and define timelines, plans, and roadmap Collaborate and communicate with modeling, development, and product teams working in a global setting to plan and execute verification strategy. Document results and provide solutions Promote an environment that fosters a commitment to data and product quality, operational excellence, collaboration, and knowledge sharing Qualifications: Experience - 2 to 4 years B.E./B.Tech./MCA with a strong programming aptitude Management Degree with specialization in Finance or someone who has worked in the financial domain with a good understanding of financial data Experience in Automation Testing using Python and SQL and Database Testing Or Strong in Data Analysis using Python and SQL Strong analytical, numeric, and problem-solving skills Ability to multi-task and prioritize tasks as needed Ability to effectively communicate and collaborate with global business and technical teams Self-starter and quick learner Ability to adapt and work in a fast-paced environment independently with little supervision This role will NOT involve working on Machine Learning techniques What You Can Expect from Us At ISS STOXX, our people are our driving force. We are committed to building a culture that values diverse skills, perspectives, and experiences.
We hire the best talent in our industry and empower them with the resources, support, and opportunities to grow professionally and personally. Together, we foster an environment that fuels creativity, drives innovation, and shapes our future success. Let's empower, collaborate, and inspire. Let's be #BrilliantTogether. About ISS STOXX ISS STOXX GmbH is a leading provider of research and technology solutions for the financial market. Established in 1985, we offer top-notch benchmark and custom indices globally, helping clients identify investment opportunities and manage portfolio risks. Our services cover corporate governance, sustainability, cyber risk, and fund intelligence. Majority-owned by Deutsche Börse Group, ISS STOXX has over 3,400 professionals in 33 locations worldwide, serving around 6,400 clients, including institutional investors and companies focused on ESG, cyber, and governance risk. Clients trust our expertise to make informed decisions for their stakeholders' benefit. Visit our website: https://www.issgovernance.com View additional open roles: https://www.issgovernance.com/join-the-iss-team/.
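The verification cycle the QBIT listing describes, building an independent prototype of a business rule and comparing it against the production implementation over devised test vectors, can be sketched roughly as below. The quorum rule and both implementations are invented for illustration; they are not ISS logic.

```python
def production_quorum(shares_for: int, shares_against: int, abstain: int) -> bool:
    # Stand-in for the production logic under review (assumed, not ISS code).
    total = shares_for + shares_against + abstain
    return total > 0 and shares_for / total > 0.5

def prototype_quorum(shares_for: int, shares_against: int, abstain: int) -> bool:
    # Independently developed prototype of the same business rule,
    # deliberately written a different way (integer arithmetic).
    votes = shares_for + shares_against + abstain
    return votes != 0 and 2 * shares_for > votes

# Devised test vectors, including boundary cases (exact tie, empty meeting).
test_vectors = [(60, 30, 10), (50, 50, 0), (0, 0, 0), (51, 49, 0)]
mismatches = [v for v in test_vectors
              if production_quorum(*v) != prototype_quorum(*v)]
print(mismatches)  # [] means the two implementations agree on all vectors
```

Any vector appearing in `mismatches` would be documented and escalated, which is the "test results documentation" step of the cycle.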

Posted 1 week ago

Apply

3.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Systems Engineering Manager Tesco India Bengaluru, Karnataka, India Hybrid Full-Time Permanent Apply by 26-Nov-2025 About the role We are seeking a dedicated and skilled Observability Operations Engineer to join our team. In this role, you will be responsible for managing and optimizing the onboarding and maintenance of observability tools such as Splunk and New Relic. You will play a key role in applying best practices in observability, improving telemetry data quality, and providing exceptional support to our customers and internal teams. Additionally, you will collaborate with DevOps practices, managing configurations and automation workflows using GitHub. What is in it for you At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on the current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco are determined by four principles - simple, fair, competitive, and sustainable. Salary - Your fixed pay is the guaranteed pay as per your contract of employment. Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy. Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF. Health is Wealth - Tesco promotes programmes that support a culture of health and wellness including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents. Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request. Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan. Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle. You will be responsible for Manage and oversee the onboarding process for Splunk, New Relic, and other observability tools. Use GitHub to manage, version control, and automate onboarding configurations, scripts, and related workflows. Implement and promote best practices in observability across the organization. Continuously monitor and improve the quality of telemetry data to ensure accuracy and reliability. Collaborate with development, engineering, and product teams to enhance observability strategies. Provide customer-facing engineering support for users of observability tools, troubleshooting issues, and offering solutions. Develop and maintain documentation, runbooks, and knowledge bases related to observability practices. Drive initiatives to improve data collection, processing, and visualization for better insights. Work closely with the DevOps team to integrate monitoring solutions into CI/CD pipelines and automate deployment processes. Stay updated with the latest trends and technologies in observability and telemetry. You will need Proven experience with monitoring and observability tools such as Splunk, New Relic, Grafana, Prometheus, etc. Strong understanding of telemetry data, data quality practices, and observability principles. 
Experience with onboarding, configuring, and maintaining monitoring solutions. Hands-on experience with version control and automation using GitHub. Knowledge of DevOps practices, CI/CD pipelines, and scripting. Excellent troubleshooting, analytical, and problem-solving skills. Customer-oriented mindset with strong communication skills. Ability to work collaboratively across teams and manage multiple priorities. Preferred Skills: Knowledge of cloud platforms (AWS, Azure). Programming/scripting skills (Python, Bash, Go, etc.). Familiarity with ITSM tools and incident management. About us Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Technology: Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products, essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets.
This platform encompasses all aspects of our operations - from identifying and authenticating customers, managing products, pricing, promoting, enabling customers to discover products, facilitating payment, and ensuring delivery. By developing a comprehensive Retail Platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without the need to overhaul our technology, thanks to the creation of capabilities we have built. At Tesco, inclusion is at the heart of everything we do. We believe in treating everyone fairly and with respect, valuing individuality to create a true sense of belonging. It's deeply embedded in our values: we treat people how they want to be treated. Our goal is to ensure all colleagues feel they can be themselves at work and are supported to thrive. Across the Tesco group, we are building an inclusive workplace that celebrates the diverse cultures, personalities, and preferences of our colleagues who, in turn, reflect the communities we serve and drive our success. At Tesco India, we are proud to be a Disability Confident Committed Employer, reflecting our dedication to creating a supportive and inclusive environment for individuals with disabilities. We offer equal opportunities to all candidates and encourage applicants with disabilities to apply. Our fully accessible recruitment process includes reasonable adjustments during interviews - just let us know what you need. We are here to ensure everyone has the chance to succeed. We believe in creating a work environment where you can thrive both professionally and personally. Our hybrid model offers flexibility - spend 60% of your week collaborating in person at our offices or local sites, and the rest working remotely. We understand that everyone's journey is different, whether you are starting your career, exploring passions, or navigating life changes.
Flexibility is core to our culture, and we're here to support you. Feel free to talk to us during your application process about any support or adjustments you may need.
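A small, assumed example of the telemetry data-quality work this observability role mentions: validating incoming metric events for required fields, numeric values, and sane timestamps before they are forwarded to a tool like Splunk or New Relic. The field names and thresholds are illustrative, not from any Tesco or vendor specification.

```python
import time

# Hypothetical schema: every telemetry event must carry these fields.
REQUIRED = {"service", "metric", "value", "timestamp"}

def validate_event(event: dict, now: float = None) -> list:
    """Return a list of data-quality issues found in one telemetry event."""
    now = time.time() if now is None else now
    issues = [f"missing field: {f}" for f in sorted(REQUIRED - event.keys())]
    ts = event.get("timestamp")
    if isinstance(ts, (int, float)) and ts > now + 300:  # 5-minute clock-skew allowance
        issues.append("timestamp in the future")
    if not isinstance(event.get("value"), (int, float)):
        issues.append("value is not numeric")
    return issues

good = {"service": "checkout", "metric": "latency_ms",
        "value": 42, "timestamp": 1_700_000_000}
bad = {"service": "checkout", "value": "fast", "timestamp": 1_700_000_000}
print(validate_event(good, now=1_700_000_100))  # []
print(validate_event(bad, now=1_700_000_100))
```

In practice a gate like this would run in the ingestion pipeline, routing failing events to a dead-letter queue so telemetry quality metrics can be reported on.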

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 19 Lacs

Gurugram, Manesar

Work from Office


Job Description The primary focus of this role is to support and work with platform managers in developing the necessary requirements and clarity to drive new digital customer solutions into the market. The Product Information Management (PIM) Analyst supports omnichannel PIM processes and serves as a central point of contact for system users, digital product/platform managers, and business stakeholders. This role is crucial in ensuring the accuracy, consistency, and completeness of product information across all digital channels. Principal Duties/Responsibilities: Support Ongoing Initiatives: Act as a subject matter expert (SME) to onboard new divisions and products into PIM. Develop relationships with divisional and business unit subject matter experts and product content specialists (PCS). Coordinate with stakeholders to define and expand data models and attribution for new products and acquisitions. Work with PCS to expand product and classification hierarchies to reflect NPIs, business acquisitions, and divisional consolidations. Support ideation, development, and training elements of digital service solutions in collaboration with business stakeholders in alignment with business objectives. Support cross-functional projects in partnership with Digital Product and Platform Owners, DS/IT, Finance, and customer operations centers to refine broad concepts and customer requirements into structured IT requirements that enable execution. Support PIM System Users: Manage governance operational processes, including Service Desk requests. Implement governance requests to support the OneAgilent omnichannel data model. Coordinate cross-functional tasks with content strategists and information architects to improve customer experience. Empower the localization team to operationalize translation processes. Drive continuous initiatives to improve the operational efficiency of PIM.
Participate in Digital Channel projects to expand the PIM footprint for enhanced omnichannel experiences. Translate VOC (Voice of Customer) into actionable insights and partner with the digital portfolio manager to develop a long-term digital service product roadmap in collaboration and alignment with Product Marketing and other key stakeholders. Create and Deliver User Documentation and Training: Develop and manage user documentation, including operational and governance process flows and user playbooks. Onboard and train new PIM users. Create self-serve help tools, such as reference videos and FAQs, to support stakeholders. Collaborate with Digital Product Adoption teams to provide technical aspects of training on new solutions and define key metrics and dashboards to track KPIs. Support Data Modeling, Standardization, Cleanup, and Migration/Conversion: Work with data stewards and PCS to monitor data integrity. Support future onboarding of data into PIM to support eCommerce, self-serve, business intelligence, other business systems, and external customer/distributor needs. Conduct market research, experimentation, customer co-design, and competitive analysis to gain in-depth market intelligence when needed. Translate customer feature requirements and capture them into Agilent's IT technical documents for IT to scope out projects. Create and Maintain Import/Export Data Feeds and Reporting: Manage import and export configurations to support bulk data updates and system integrations. Manage export configurations to support distributor/partner/channel requirements (e.g., eCatalogs, China commerce). Generate system reports to support business requirements. Be accountable for tracking and delivering projects on-time and on-budget. Own test strategy and coordination of cross-functional teams for digital solution testing and business sign-off. Recommend enhancements for existing functionalities to improve ease of use for customers.
Qualifications BS/MS degree in Business, Information Systems, or a related field. Overall, at least 8 years of experience. 5+ years of experience in PIM, data management, or a related role. Strong understanding of data modeling, data governance, and data quality principles. Experience with PIM systems and tools. Excellent verbal and written communication skills. Strong analytical and problem-solving skills. Ability to manage multiple projects and priorities. Experience working in a cross-functional team environment. It's a global Digital PIM analyst position; the role therefore requires regular flexibility to accommodate global time zones and to collaborate with stakeholders across them. Additional Details This job has a full-time weekly schedule. It includes the option to work remotely. Our pay ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. During the hiring process, a recruiter can share more about the specific pay range for a preferred location. Pay and benefit information by country are available at: https://careers.agilent.com/locations Travel Required: Occasional Shift: Day Duration: No End Date Job Function: Marketing
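The completeness checks a PIM analyst runs over product attribution can be sketched as follows: a hypothetical data model where each product classification defines its required attributes, and records are scored for completeness against it. The classifications and attribute names are assumptions for illustration, not Agilent's data model.

```python
# Hypothetical attribute model: required attributes per product classification.
DATA_MODEL = {
    "chromatography": {"sku", "name", "column_length_mm", "particle_size_um"},
    "consumable": {"sku", "name", "pack_size"},
}

def completeness(record: dict) -> float:
    """Fraction of required attributes present and non-empty for the record's class."""
    required = DATA_MODEL[record["classification"]]
    filled = sum(1 for attr in required if record.get(attr) not in (None, ""))
    return filled / len(required)

product = {"classification": "chromatography", "sku": "A-100",
           "name": "HPLC Column", "column_length_mm": 150,
           "particle_size_um": None}  # one required attribute missing
print(f"{completeness(product):.2f}")  # 0.75
```

A score below an agreed threshold would typically block the record from syndication to downstream channels until the product content specialists fill the gaps.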

Posted 1 week ago

Apply

2.0 - 10.0 years

6 - 10 Lacs

Pune

Work from Office


About Gruve Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks. About the Role We are looking for a highly motivated and performance-driven Manager - Lead Generation to lead and scale our lead generation function. This role is pivotal to building a strong and predictable sales pipeline by managing a team responsible for prospecting, outreach, and qualification of leads for our services and solutions. Key Responsibilities Team Leadership & Management Lead, coach, and mentor a team of SDRs / lead generation specialists Set performance KPIs (e.g., leads/MQLs generated, contact rate, conversion rate) Run weekly reviews, feedback sessions, and pipeline health checks Recruit, onboard, and train new team members Lead Generation Strategy Own the execution of outbound lead generation campaigns across channels (email, LinkedIn, phone, events) Build, refine, and optimize prospecting playbooks and cadences Monitor and improve the quality of leads passed to the BD/Sales team Work closely with marketing to align on ICP, messaging, and content Research & Contact List Building Oversee prospect research to identify decision-makers and relevant target accounts Guide the team in building and maintaining accurate and segmented contact lists Ensure use of tools like LinkedIn Sales Navigator, ZoomInfo, and Apollo for data enrichment Drive best practices in targeting, segmentation, and lead qualification Tools & Process Management Manage prospecting tools like Apollo, ZoomInfo, LinkedIn Sales Navigator, etc.
Ensure CRM (Salesforce/HubSpot) hygiene and reporting accuracy Define and enforce structured outreach processes and SLAs Analytics & Reporting Track team performance metrics and drive continuous improvement Deliver weekly and monthly dashboards to senior leadership Provide insights on what's working and what needs adjustment in targeting or messaging Required Skills & Experience 6-10 years of experience in sales, lead generation, or inside sales 2+ years of experience managing a lead generation or SDR team Proven ability to drive qualified pipeline through outbound efforts Experience with sales automation tools (Apollo, Outreach, HubSpot, Salesforce, etc.) Strong analytical, communication, and leadership skills Comfortable working in a target-driven, high-velocity environment Experience in IT services, SaaS, or consulting business preferred Success Metrics % of qualified leads meeting acceptance criteria Lead-to-opportunity conversion rate Team productivity (outreach volume, meetings booked) CRM compliance and data quality Why Gruve At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you're passionate about technology and eager to make an impact, we'd love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

Posted 1 week ago

Apply

4.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office


Design and develop robust data pipelines and ETL processes Support data migration from traditional systems (Oracle, Vertica) to modern cloud platforms (Snowflake, AWS) Build and maintain scalable data ingestion and transformation frameworks Optimize and tune data queries for performance and efficiency Implement and manage cloud-based data solutions (Snowflake, AWS, Azure) Work closely with architects and senior engineers on solution design Participate in Agile development practices (Scrum/Kanban) Conduct unit testing, troubleshoot issues, and ensure data quality Document data models, processes, and best practices Technical Skills: Cloud Platforms: AWS (preferred), Snowflake, Azure (nice to have) Databases: Oracle, SQL Server, Vertica, MongoDB Languages: Python, SQL, Shell scripting (Java is a plus) Tools: Spark (preferred), Tableau, Jenkins, Git, Docker (basic understanding) Frameworks: Familiarity with REST APIs and microservices architecture Data Skills: Data modeling, ETL, Data Migration, Data Quality Agile Methodologies: Experience working in Scrum/Kanban teams Mandatory Competencies Cloud - AWS Cloud - Azure Data on Cloud - Snowflake Beh - Communication Database - SQL At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
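As a rough illustration of the migration work this listing describes, a batched extract-and-load loop is sketched below. Both source and target are in-memory SQLite stand-ins for Oracle and Snowflake, and the table and column names are invented; a real migration would use the respective drivers, staging, and type mapping.

```python
import sqlite3

def migrate(source: sqlite3.Connection, target: sqlite3.Connection,
            table: str, batch_size: int = 2) -> int:
    """Copy rows from source to target in fixed-size batches; return rows moved."""
    cur = source.execute(f"SELECT id, name FROM {table}")
    moved = 0
    while True:
        batch = cur.fetchmany(batch_size)  # keeps memory bounded on large tables
        if not batch:
            break
        target.executemany(f"INSERT INTO {table} (id, name) VALUES (?, ?)", batch)
        moved += len(batch)
    target.commit()
    return moved

src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Asha"), (2, "Ravi"), (3, "Meera")])
dst.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
moved = migrate(src, dst, "customers")
print(moved)  # 3
```

A post-load row-count (or checksum) comparison between source and target is the usual data-quality gate on a run like this.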

Posted 1 week ago

Apply

4.0 - 8.0 years

15 - 17 Lacs

Noida

Work from Office


Establish, maintain, and continuously improve governance frameworks, policies, and best practices in our area. Ensure that projects, programs, and portfolios align and operate with efficiency, transparency, and control. Develop/enhance technology methodology and standard practices, processes and tools with a focus on industry best practices to ensure effective and efficient practice delivery. Develop and deliver standard practices, processes and tools that are consistent and repeatable. Influence the implementation and adoption of methodology and new practices, processes and tools through development of strong practitioner community relationships. Key Responsibilities Determine scope of Cloud initiatives through research and fact-finding, combined with an understanding of applicable business requirements and technology. Partner with Service Delivery Manager on risks, issue management and resolution. Work with SDM and engineering team to maintain project plan containing objectives, timeline, priorities and risks - this includes milestones using designated tool sets. Coordinate requirements gathering sessions, stand-ups, meetings with business representatives Document requirements, program functions, data quality reports and analysis. Coordinate and support Production issues and fixes while delivering on pre-aligned agenda for the sprint. Ability to scope in a technically complex and fast-changing environment, respond calmly and rationally in a constantly changing, deadline driven environment. Point of contact during the project for all aspects of the cloud Infrastructure. Ensures a strong and seamless relationship by maintaining communications about the project to the stakeholders: business partners, management, and delivery. Responsible for regular status reports Stays up to date with technological and/or process developments and demonstrates knowledge and expertise with Cloud enablement and an ability to evaluate solutions.
Required Qualifications Provide appropriate governance oversight to ensure that the practitioner community is adhering to standard methodology, processes and practices. Define the organizational measures required to determine the state of the practice area and if practitioners are operating successfully. Develop and administer the tools required to effectively measure practitioner skill assessments. Lead the development of a continuous feedback process for practitioners to identify process improvements. Facilitate the transformation from practice area process and tool introduction to internalization. Lead the delivery of improvements in practice, process and tool effectiveness. Lead cross functional teams to identify opportunities to strengthen existing processes, practices and tools. Plan, develop and lead the implementation of improvement recommendations. Support the user needs and functional capabilities of practice tools, enabling platforms that provide accurate and standard reflection of project agenda/health. Provide consulting and mentoring within technology practice area of expertise to practitioner community. Educate project execution leaders and practitioners on the benefits of practice area methodology, process and tool usage. Support ad-hoc needs for project resources by providing project/program start-up or on-going support within assigned technology practice area. Drive effective and efficient project delivery. Perform project delivery related governance and compliance functions as required. Partner with the appropriate vendor subject matter experts to develop and maintain tool documentation as well as design, develop and implement the required internal and external training (formal and informal) required to support the practice area resources at all competency levels. Provide support to the practice organization to improve the performance of practitioners through coaching, tool development or other assessment. 
Develop a sustainable training program to address the needs of new practitioners. Proactively keep current on latest industry practices, process and tool trends. Maintain up-to-date understanding of available resources including appropriate training, job aids and best practices. Mentor peers and more junior staff. Actively champion and contribute to the continuous improvement of the assigned practice area best practices using innovative ideas to increase the effectiveness of the practice organization. Lead and participate in project phase reviews and post implementation reviews. Preferred Qualifications AWS Cloud certifications PMP certification Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm's focus areas include Asset Management and Advice, Retirement Planning and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions and work with other talented individuals who share your passion for doing great work. You'll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP. Full-Time/Part-Time Timings (4:45p-1:15a) India Business Unit AWMPO AWMPS Presidents Office Job Family Group Technology

Posted 1 week ago

Apply

0.0 - 8.0 years

12 - 13 Lacs

Pune

Work from Office


Join us as a Data Analyst at Barclays. Step into the role of Data and Records Governance, where you will work on a best-in-class data governance and reporting function by leading, planning and remediating high-focus regulatory findings associated with data governance. You will be evolving the data lineage tooling, data quality tooling and operating model to make the creation and maintenance of data lineage and data controls more sustainable, as well as work with global business and technology teams to enable data governance in business value streams. If you are an experienced data practitioner who is passionate about discovering new data findings and driving change, this is a perfect role for you. To be successful as a Data Analyst, you should have experience with: Data and Record governance, data controls, data lineage and associated methodologies. Experience in data products, cloud and data warehouses Business Domain (Retail or Banking) and Regulatory reporting experience. Working in a regulated environment and solid understanding of data and control risk management. Some other highly valued skills may include: Understanding of different technologies around the execution of data control. Ability to proactively drive change. Exceptional stakeholder management skills to be able to maintain collaborative working relationships with key senior stakeholders. Experience of working in multiple large teams delivering complex services involving the highest standards of resilience, risk and governance controls. Proficiency in data analytics and insight generation to derive actionable insights from data. You may be assessed on the key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills.
This role is for Pune Location Purpose of the role To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations. Accountabilities Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards. Monitoring data quality and records metrics and compliance with standards across the organization. Identification and addressing of data and records management risks and gaps. Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval and disposal of records. Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives. Provision of Group-wide guidance and training on Data and Records Management standard requirements. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business aligned support areas to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external sources such as procedures and practises (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. Complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
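Data lineage, which the Barclays listing mentions repeatedly, is at heart a dependency graph from raw sources through transformations to reports. A minimal, assumed sketch of upstream-lineage tracing over such a graph (dataset names are invented):

```python
# Hypothetical lineage graph: each dataset maps to its direct upstream sources.
LINEAGE = {
    "regulatory_report": ["risk_mart"],
    "risk_mart": ["trades_clean", "positions_clean"],
    "trades_clean": ["trades_raw"],
    "positions_clean": ["positions_raw"],
    "trades_raw": [],
    "positions_raw": [],
}

def upstream(dataset: str) -> set:
    """All datasets the given dataset ultimately depends on (transitive closure)."""
    seen, stack = set(), list(LINEAGE[dataset])
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(LINEAGE[node])
    return seen

print(sorted(upstream("regulatory_report")))
```

Answering "which raw feeds does this regulatory report depend on?" with a traversal like this is what makes lineage-based impact analysis and control placement sustainable, rather than re-discovering dependencies by hand for each finding.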

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Project description
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities
• Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
• Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
• Ensure alignment of data models with Avaloq's object model and industry best practices.
• Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
• Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
• Provide expert input on data governance, metadata management, and model documentation.
• Contribute to change requests, upgrades, and data migration projects involving Avaloq.
• Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
• Review and validate existing data models; identify gaps or optimisation opportunities.
• Ensure data models meet performance, security, and privacy requirements.

Skills

Must have
• Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
• 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
• Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
• Proficient in SQL and data manipulation in Avaloq environments.
• Knowledge of banking products, client lifecycle data, and regulatory data requirements.
• Familiarity with data governance, data quality, and master data management concepts.
• Experience working in Agile or hybrid project delivery environments.

Nice to have
• Exposure to Avaloq Scripting or parameterisation is a strong plus.
• Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
• Understanding of data privacy regulations (GDPR, FINMA, etc.).
• Certification in Avaloq or relevant financial data management domains is advantageous.

Other
Languages: English: C1 Advanced
Location: Pune, Bangalore, Hyderabad, Chennai, Noida
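The data-profiling and quality-check responsibilities in this listing boil down to measuring per-column completeness and flagging columns that breach a threshold. A minimal, self-contained Python sketch is below; the client records, column names, and the 5% threshold are hypothetical, and nothing here is an Avaloq API (a real check would query the core banking database via SQL):

```python
# Data-profiling sketch: per-column null rate plus a simple completeness rule.
# Records and the threshold are hypothetical illustrations.

from typing import Any

def profile_null_rates(rows: list[dict[str, Any]]) -> dict[str, float]:
    """Return the fraction of missing (None/empty) values per column."""
    if not rows:
        return {}
    columns = rows[0].keys()
    return {
        col: sum(1 for r in rows if r.get(col) in (None, "")) / len(rows)
        for col in columns
    }

def failing_columns(null_rates: dict[str, float], threshold: float = 0.05) -> list[str]:
    """Columns whose null rate exceeds the allowed threshold."""
    return sorted(c for c, rate in null_rates.items() if rate > threshold)

clients = [
    {"client_id": 1, "domicile": "CH", "lei": "5493001KJTIIGC8Y1R12"},
    {"client_id": 2, "domicile": "DE", "lei": None},
    {"client_id": 3, "domicile": "", "lei": None},
]

rates = profile_null_rates(clients)
print(rates)                   # domicile ~33% missing, lei ~67% missing
print(failing_columns(rates))  # ['domicile', 'lei']
```

The same pattern scales to lineage and regulatory checks: compute a metric per column or per source, compare it against a governed threshold, and report the violations.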

Posted 1 week ago

Apply

12.0 - 17.0 years

6 - 10 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


About Us
Narwal, with its Global Delivery Model, strategically expands its reach across North America, the United Kingdom, and an offshore development centre in India. Delivering cutting-edge AI, Data and Quality Engineering solutions and consistently surpassing expectations, Narwal has achieved remarkable triple-digit growth rates year after year, earning accolades such as Inc. 5000, Best IT Services Company, Best Data Technology Company, and Partner of the Year with Tricentis.

Our Vision
To be an expert in AI, Data, Cloud and Quality Engineering transformations, bold in our thinking and authentic in our relationships.

About the team
We are building world-class Global Data Solutions using cloud technologies such as AWS, Snowflake, Fivetran and Astronomer for batch, near-real-time and streaming data. This is your opportunity to join a rapidly growing international data team focused on driving cloud adoption and enabling AI/ML solutions. We leverage the SAFe Agile framework to execute programs efficiently and accelerate business value for our internal and external customers.

What you will be doing
This role is a key position within the Data Services Organization. You will be accountable for hands-on development of high-quality data products, engagement with data requestors, and guiding junior members as needed. You will drive the end-to-end data solution life cycle from business requirement to production release.
• Hands-on development utilizing AWS tools, Snowflake, dbt and Astronomer
• Lead a smaller team of data engineers to deliver data programs and provide guidance in technical design, architecture, troubleshooting and deployment
• Effectively engage with senior management and business partners to own and deliver solutions
• Proficient in Python scripting and stored procedures
• Manage and develop processes for batch and real time using Fivetran and Kafka
• Experience in orchestration and monitoring tools like dbt and Astronomer, and GitHub version control
• Experience in leveraging GitHub Copilot for efficient code generation
• Proficiency in leveraging compute and storage resources in a cloud data platform to deliver cost-effective data solutions
• Ability to deliver solutions with a focus on data quality, performance and master data management
• Partner with the business teams/stakeholders to understand their strategic goals and help them drive value through data/analytics capability and technology
• Take accountability and ownership for all the solutions delivered with a sense of urgency
• Ability to incorporate industry-standard technology solutions into day-to-day delivery

What you bring
• 12+ years of related experience with a bachelor's degree, or 5 years and a master's degree
• 5-7+ years of experience in ETL, data warehousing concepts (on-prem and cloud), and database programming
• 3+ years of experience in Snowflake/Python scripting
• Experience with scripting languages, preferably JavaScript and shell scripting, and AWS services such as S3, EC2, Lambda, SQS and SNS
• Working knowledge of reporting tools like Tableau, Athena, WebFOCUS, Power BI
• Excellent communication and presentation skills, able to tailor results into language appropriate for the target audience
• Experience working in all stages of a mature Agile software development life cycle
• Experience in handling very large datasets and diverse data formats like Parquet, Iceberg, JSON
• Extensive experience capturing and translating complex business requirements into practical solutions

Added bonus if you have
• Experience in the payments processing industry
• Experience in handling sensitive data (PCI, PII)
• Familiarity with use of a wiki for documentation
• Experience with the Scaled Agile Framework for Enterprises (SAFe)

What we offer you
• Exciting opportunity to work with leading-edge cloud-based technologies such as AWS & Snowflake as we transform our business to cloud first
• Varied and challenging work and opportunities to help you grow and develop your career
• A modern, international, and flexible work environment and a dedicated and motivated team
• Time to support charities and give back in your community
• A competitive salary and benefits

Narwal is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. For more information please visit: https://www.narwalinc.com/
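The batch and near-real-time processing this role calls for often reduces to windowed aggregation over an event stream. A tiny, stdlib-only Python sketch is below; the event shape and the 60-second tumbling window are hypothetical, and a production pipeline would consume from Kafka or Fivetran rather than an in-memory list:

```python
# Tumbling-window aggregation sketch: group events into fixed 60-second
# windows and sum amounts per window. Events and window size are hypothetical.

from collections import defaultdict

WINDOW_SECONDS = 60

def window_start(ts: int) -> int:
    """Align a Unix timestamp to the start of its tumbling window."""
    return ts - (ts % WINDOW_SECONDS)

def aggregate(events: list[dict]) -> dict[int, float]:
    """Sum event amounts per window, keyed by window start time."""
    totals: dict[int, float] = defaultdict(float)
    for e in events:
        totals[window_start(e["ts"])] += e["amount"]
    return dict(totals)

events = [
    {"ts": 100, "amount": 10.0},  # falls in window starting at 60
    {"ts": 119, "amount": 5.0},   # same window
    {"ts": 120, "amount": 2.5},   # next window
]
print(aggregate(events))  # {60: 15.0, 120: 2.5}
```

A batch job applies the same logic to a day's files; a streaming job applies it incrementally as events arrive, which is why the window-alignment function is kept separate from the aggregation.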

Posted 1 week ago

Apply

1.0 - 7.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


About Us
Narwal, with its Global Delivery Model, strategically expands its reach across North America, the United Kingdom, and an offshore development centre in India. Delivering cutting-edge AI, Data and Quality Engineering solutions and consistently surpassing expectations, Narwal has achieved remarkable triple-digit growth rates year after year, earning accolades such as Inc. 5000, Best IT Services Company, Best Data Technology Company, and Partner of the Year with Tricentis.

Our Vision
To be an expert in AI, Data, Cloud and Quality Engineering transformations, bold in our thinking and authentic in our relationships.

About the team
We are building world-class Global Data Solutions using cloud technologies such as AWS, Snowflake, Fivetran and Astronomer for batch, near-real-time and streaming data. This is your opportunity to join a rapidly growing international data team focused on driving cloud adoption and enabling AI/ML solutions. We leverage the SAFe Agile framework to execute programs efficiently and accelerate business value for our internal and external customers.

What you will be doing
This role is a key position within the Data Services Organization. You will be accountable for hands-on development of high-quality data products, engagement with data requestors, and guiding junior members as needed. You will drive the end-to-end data solution life cycle from business requirement to production release.
• Hands-on development utilizing AWS tools, Snowflake, dbt and Astronomer
• Proficient in Python scripting and stored procedures
• Develop processes for batch and real time using Fivetran and Kafka
• Experience in orchestration and monitoring tools like dbt and Astronomer, and version control such as GitHub
• Experience in leveraging GitHub Copilot for efficient code generation
• Proficiency in leveraging compute and storage resources in a cloud data platform to deliver cost-effective data solutions
• Ability to deliver solutions with a focus on data quality, performance and master data management
• Partner with business users to understand business requirements and code solutions under the guidance of technical leads
• Take accountability and ownership for all the solutions delivered with a sense of urgency

What you bring
• 3-7 years of related experience with a bachelor's degree, or 5 years and a master's degree
• 3-5+ years of experience in ETL, data warehousing concepts (on-prem and cloud), and database programming
• 1-3 years of experience in Snowflake/Python scripting
• Experience with scripting languages, preferably JavaScript and shell scripting, and AWS services such as S3, EC2, Lambda, SQS and SNS
• Working knowledge of reporting tools like Tableau, Athena, WebFOCUS, Power BI
• Excellent communication and presentation skills, able to present technology solutions effectively to technical and business users
• Experience working in all stages of a mature Agile software development life cycle
• Experience in handling diverse data formats like Parquet, Iceberg, JSON
• Experience capturing and translating complex business requirements into practical solutions

Added bonus if you have
• Experience in the payments processing industry
• Experience in handling sensitive data (PCI, PII)
• Familiarity with use of a wiki for documentation
• Experience with the Scaled Agile Framework for Enterprises (SAFe)

What we offer you
• Exciting opportunity to work with leading-edge cloud-based technologies such as AWS & Snowflake as we transform our business to cloud first
• Varied and challenging work and opportunities to help you grow and develop your career
• A modern, international, and flexible work environment and a dedicated and motivated team
• Time to support charities and give back in your community
• A competitive salary and benefits

Narwal is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. For more information please visit: https://www.narwalinc.com/

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


About Us
Narwal, with its Global Delivery Model, strategically expands its reach across North America, the United Kingdom, and an offshore development centre in India. Delivering cutting-edge AI, Data and Quality Engineering solutions and consistently surpassing expectations, Narwal has achieved remarkable triple-digit growth rates year after year, earning accolades such as Inc. 5000, Best IT Services Company, Best Data Technology Company, and Partner of the Year with Tricentis.

Our Vision
To be an expert in AI, Data, Cloud and Quality Engineering transformations, bold in our thinking and authentic in our relationships.

About the team
We are building world-class Global Data Solutions using cloud technologies such as AWS, Snowflake, Fivetran and Astronomer for batch, near-real-time and streaming data. This is your opportunity to join a rapidly growing international data team focused on driving cloud adoption and enabling AI/ML solutions. We leverage the SAFe Agile framework to execute programs efficiently and accelerate business value for our internal and external customers.

What you will be doing
This role is a key position within the Data Services Organization. You will be accountable for hands-on development of high-quality data products, engagement with data requestors, and guiding junior members as needed. You will drive the end-to-end data solution life cycle from business requirement to production release.
• Hands-on development utilizing AWS tools, Snowflake, dbt and Astronomer
• Proficient in Python scripting and stored procedures
• Manage and develop processes for batch and real time using Fivetran and Kafka
• Lead development for small/medium-sized projects
• Experience in orchestration and monitoring tools like dbt and Astronomer, and version control such as GitHub
• Experience in leveraging GitHub Copilot for efficient code generation
• Proficiency in leveraging compute and storage resources in a cloud data platform to deliver cost-effective data solutions
• Ability to deliver solutions with a focus on data quality, performance and master data management
• Partner with the business teams/stakeholders to understand their strategic goals and help them drive value through data/analytics capability and technology
• Take accountability and ownership for all the solutions delivered with a sense of urgency
• Ability to incorporate industry-standard technology solutions into day-to-day delivery

What you bring
• 8-10 years of related experience with a bachelor's degree, or 5 years and a master's degree
• 5-7+ years of experience in ETL, data warehousing concepts (on-prem and cloud), and database programming
• 3+ years of experience in Snowflake/Python scripting
• Experience with scripting languages, preferably JavaScript and shell scripting, and AWS services such as S3, EC2, Lambda, SQS and SNS
• Working knowledge of reporting tools like Tableau, Athena, WebFOCUS, Power BI
• Excellent communication and presentation skills, able to tailor results into language appropriate for the target audience
• Experience working in all stages of a mature Agile software development life cycle
• Experience in handling very large datasets and diverse data formats like Parquet, Iceberg, JSON
• Extensive experience capturing and translating complex business requirements into practical solutions

Added bonus if you have
• Experience in the payments processing industry
• Experience in handling sensitive data (PCI, PII)
• Familiarity with use of a wiki for documentation
• Experience with the Scaled Agile Framework for Enterprises (SAFe)

What we offer you
• Exciting opportunity to work with leading-edge cloud-based technologies such as AWS & Snowflake as we transform our business to cloud first
• Varied and challenging work and opportunities to help you grow and develop your career
• A modern, international, and flexible work environment and a dedicated and motivated team
• Time to support charities and give back in your community
• A competitive salary and benefits

Narwal is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. For more information please visit: https://www.narwalinc.com/

Posted 1 week ago

Apply