10.0 - 15.0 years
30 - 37 Lacs
Bengaluru
Hybrid
Roles and Responsibilities:
- Take ownership of the team's deliverables in data cataloguing, metadata enrichment, data quality management, master data management, data control, and auditing.
- Monitor the team's deliverables closely and hand-hold as required (team members' experience ranges from 0 to 15 years).
- Lead daily stand-up meetings and follow up with different stakeholders as needed; report status to senior management.

Must Have:
- 12-17 years of experience in data-related projects.
- 6+ years of deep hands-on experience in Data Governance, Data Quality, Metadata Management, and Master Data Management.
- Well-rounded skills and experience in design, development, and project management (needed to understand the practical problems faced by team members at different levels).
- Agile delivery experience.
- Solid conceptual knowledge, implementation experience, and hands-on tool experience in data governance, data quality, data cataloguing, metadata enrichment, metadata management, and reference data management.
- Good verbal and written communication skills.
- Go-getter with a strong execution drive; able to work with very little direction.
- Current with the latest trends in cloud data engineering and AI/ML (at least at a conceptual level).
- Hands-on experience in SQL, Python, DevOps, ETL, and cloud data engineering is a big plus.
Posted 1 week ago
18.0 - 23.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Sr Director, Software Engineering - AI/ML
Location: Bengaluru, India (Hybrid)

At Reltio, we believe data should fuel business success. Reltio's AI-powered data unification and management capabilities, encompassing entity resolution, multi-domain master data management (MDM), and data products, transform siloed data from disparate sources into unified, trusted, and interoperable data. Reltio Data Cloud delivers interoperable data where and when it's needed, empowering data and analytics leaders with unparalleled business responsiveness. Leading enterprise brands across multiple industries around the globe rely on our award-winning data unification and cloud-native MDM capabilities to improve efficiency, manage risk, and drive growth.

At Reltio, our values guide everything we do. With an unyielding commitment to prioritizing our Customer First, we strive to ensure their success. We embrace our differences and are Better Together as One Reltio. We are always looking to Simplify and Share our knowledge when we collaborate to remove obstacles for each other. We hold ourselves accountable for our actions and outcomes and strive for excellence. We Own It. Every day, we innovate and evolve, so that today is Always Better Than Yesterday. If you share and embody these values, we invite you to join our team at Reltio and contribute to our mission of excellence.

Reltio has earned numerous awards and top rankings for our technology, our culture, and our people. Reltio was founded on a distributed workforce and offers flexible work arrangements to help our people manage their personal and professional lives. If you're ready to work on unrivaled technology where your desire to be part of a collaborative team is met with a laser-focused mission to enable digital transformation with connected data, let's talk!

Job Summary: Lead a talented team of engineers, data scientists, and architects to create and implement scalable AI/ML solutions. You will be responsible for shaping AI strategy, driving innovation, and ensuring seamless integration of AI/ML models into Reltio's products. This high-impact leadership role requires deep technical expertise, strategic vision, and the ability to influence stakeholders across engineering, product, and business teams.
Job Duties and Responsibilities:
- AI/ML Strategy & Leadership: Formulate and implement the AI/ML roadmap to align with Reltio's business goals.
- Team Management & Development: Lead, mentor, and expand a team of AI/ML engineers, data scientists, and software developers.
- Innovation & Product Integration: Spearhead AI-driven innovations and ensure seamless integration of ML models into Reltio's data management solutions.
- Technical Excellence: Supervise the design, development, and optimization of AI/ML algorithms, large-scale data pipelines, and cloud-based ML architectures.
- Collaboration & Stakeholder Engagement: Collaborate closely with product management, engineering, and customer success teams to deliver AI-powered solutions that enhance customer experience.
- Operational Excellence: Ensure high availability, scalability, and security of AI/ML services in production.
- Emerging Technologies: Stay ahead of industry trends by evaluating and adopting the latest advancements in AI/ML, LLMs, and MLOps.

Skills You Must Have:
- 18+ years of experience in software engineering, with at least 5 years in AI/ML leadership roles.
- Strong expertise in AI/ML model development, deep learning, NLP, and MLOps.
- Experience deploying ML models on cloud platforms (AWS, GCP, or Azure) and distributed computing frameworks.
- Proven track record of delivering AI-powered SaaS solutions at scale.
- Hands-on experience with ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Knowledge of generative AI, LLMs, and advanced ML techniques.
- Familiarity with Kubernetes, Docker, and CI/CD pipelines for ML deployment.
- Expertise in big data technologies such as Spark, Kafka, Snowflake, or Databricks.
- Strong leadership, stakeholder management, and communication skills.
- Experience working in agile, fast-paced environments with global teams.
- Prior experience building products using popular LLM models such as ChatGPT, Gemini, etc.

Skills That Are Nice to Have:
- Experience in enterprise data management, MDM, or data governance solutions.
- Exposure to AI/ML compliance, bias mitigation, and ethical AI frameworks.

Why Join Reltio?
- Health & Wellness: Comprehensive group medical insurance (including your parents) with additional top-up options, accidental insurance, life insurance, free unlimited online doctor consultations, and an Employee Assistance Program (EAP).
- Work-Life Balance: 36 annual leaves (18 sick leaves and 18 earned leaves); 26 weeks of maternity leave and 15 days of paternity leave; plus, unique to Reltio, one week of additional time off as a recharge week every year, globally.
- Support for home office setup: home office setup allowance.
- Stay Connected, Work Flexibly: mobile and internet reimbursement.
- No need to pack a lunch: we've got you covered with a free meal. And many more.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description:
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identifying gaps or optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills

Must have:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice to have:
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.

Other Languages: English: C1 Advanced
Location: Pune, Bangalore, Hyderabad, Chennai, Noida
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Experience: 3 to 6 years
CTC: 10 LPA
Location: Whitefield, Bangalore

We are seeking a highly skilled AI Engineer with deep expertise in Large Language Models (LLMs) and Natural Language Processing (NLP) to design, fine-tune, and deploy cutting-edge AI-driven text applications. The ideal candidate will have hands-on experience in prompt engineering, transformer-based model fine-tuning, and MLOps practices for scalable deployment.

Key Responsibilities:
- Collaborate with Daimler's business units and enabling areas to identify and support high-impact AI use cases.
- Design and develop LLM-based applications using models such as GPT, LLaMA, BERT, etc.
- Fine-tune and optimize LLMs for tasks including sentiment analysis, summarization, translation, and conversational AI.
- Work with embeddings, tokenization, and vector databases (e.g., FAISS, Pinecone) for semantic search and retrieval.
- Integrate AI models into production systems in collaboration with product and engineering teams.
- Apply MLOps best practices for model lifecycle management, including versioning, monitoring, and CI/CD pipelines.
- Ensure compliance with ethical AI standards, data privacy regulations, and GDPR.
- Communicate findings and recommendations effectively to executive stakeholders through high-quality deliverables and presentations.

Required Skills & Qualifications:
- Proven experience with NLP libraries and frameworks such as Hugging Face Transformers, spaCy, NLTK, and the OpenAI API.
- Solid understanding of MLOps tools and practices for managing AI workflows in production.
- Strong understanding of transformer architectures and experience fine-tuning LLMs.
- Proficiency with vector databases and semantic search technologies.
- Experience with model optimization techniques such as quantization and distillation.
- Experience with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).
- Familiarity with data governance, model interpretability, and responsible AI frameworks.
- Demonstrated success in deploying at least one LLM-based application to production.
- Strong problem-solving skills, attention to detail, and ability to work in cross-functional teams.
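As an illustration of the embeddings and vector-search work described above, here is a minimal semantic-retrieval sketch with FAISS. The random vectors stand in for real sentence embeddings (which would normally come from a Hugging Face or OpenAI embedding model), and the dimensionality and corpus size are hypothetical.

```python
# Minimal semantic-search sketch with FAISS (illustrative only).
# Random vectors stand in for real sentence embeddings; in practice the
# corpus and queries would be encoded with the same embedding model.
import numpy as np
import faiss

dim = 384                      # embedding dimensionality (hypothetical)
rng = np.random.default_rng(0)

# Pretend corpus: 1,000 document embeddings, L2-normalised so that
# inner product equals cosine similarity.
doc_vectors = rng.random((1000, dim), dtype="float32")
faiss.normalize_L2(doc_vectors)

index = faiss.IndexFlatIP(dim)  # exact inner-product search
index.add(doc_vectors)

# One query embedding, normalised the same way.
query = rng.random((1, dim), dtype="float32")
faiss.normalize_L2(query)

scores, ids = index.search(query, 5)   # top-5 nearest documents
print("top document ids:", ids[0])
print("similarities:", scores[0])
```

In a production retrieval system this flat index would typically be replaced by an approximate index (or a managed store such as Pinecone) once the corpus grows beyond what exact search handles comfortably.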
Posted 1 week ago
8.0 - 12.0 years
12 - 17 Lacs
Bengaluru
Work from Office
MDM Field Engineer

Build Your Career at Informatica. We seek innovative thinkers who believe in the power of data to drive meaningful change. At Informatica, we welcome adventurous, work-from-anywhere minds eager to tackle the world's most complex challenges. Our employees are empowered to push their bold ideas forward, and we are united by a shared passion for using data to do the extraordinary for each other and the world.

MDM Field Engineer - Remote, US
We're looking for an MDM Field Engineer with experience in pre-sales technical support and strategic customer engagement to join our team remotely. You will report to the Associate Director. You will act as a trusted advisor to both technical and executive stakeholders, delivering complex product solutions, driving customer success, and providing feedback to influence product direction.

Technology You'll Use: Data Integration, Master Data Management, Data Governance, Data Quality

Your Role Responsibilities? Here's What You'll Do:
- Deliver pre-sales technical support and solution consulting for Informatica's cloud-based data governance, data integration, and master data management solutions.
- Work collaboratively with solution architects, customer success teams, and sales leadership to develop and present tailored solutions addressing complex customer requirements.
- Engage with stakeholders across all levels, from technical teams to executive leadership, to articulate Informatica's value proposition and demonstrate technical proofs of concept (POCs), pilots, and demos.
- Provide expert guidance and prepare comprehensive product specifications to meet customer use cases and support successful implementations.
- Mentor and enable junior engineers and architects through knowledge sharing, creation of reusable assets, and best practices.
- Partner closely with product management and engineering teams to relay customer feedback, identify product enhancements, and participate in roadmap planning.
- Balance customer-facing technical responsibilities with strategic influence, helping shape data management solutions that solve real-world business challenges.
- Support RFP responses and marketing activities as a technical subject matter expert.
- Maintain up-to-date certifications and continuously enhance your knowledge of cloud platforms, data security, and Informatica products.

What We'd Like to See:
- Ability to demonstrate to and engage with a wide audience, including senior leadership.
- Experience working in cross-functional teams across sales, product management, and engineering.
- Passion for innovation and continuous learning in an evolving technological landscape.

Role Essentials:
- 8-12 years of relevant experience in data integration, master data management, data governance, or related technical consulting roles.
- Strong communication, collaboration, and teamwork skills.
- Proven in both technical and business contexts, engaging audiences ranging from hands-on engineers to C-level executives.
- Proficiency in at least one scripting or coding language and intermediate certification(s) in cloud ecosystems such as AWS, Azure, or GCP.
- Solid understanding of cloud computing security concepts and experience advising on integrations with applications like Salesforce, SAP, or custom solutions.
- Strong expertise in Informatica products, with the ability to create strategic proof plans and execute successful customer engagements.
Education & Experience:
- Bachelor's degree or equivalent experience; relevant professional certifications. BA/BS or equivalent educational background; we will consider an equivalent combination of relevant education and experience.
- Minimum 8+ years of relevant professional experience.

Perks & Benefits:
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans).
- Flexible time-off policy and hybrid working practices.
- Equity opportunities and an employee stock purchase program (ESPP).
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.
Posted 1 week ago
10.0 - 14.0 years
13 - 17 Lacs
Hyderabad
Work from Office
- Extensive expertise in Microsoft 365, Entra ID, and the multi-tenant capabilities of these platforms.
- Expertise in Microsoft's latest multi-tenancy feature set, MTO (Multi-Tenant Organization).
- Expertise in setting up and configuring Microsoft 365 collaboration tools, including Microsoft Teams, SharePoint Online, Viva Engage, and Exchange Online, especially in a multi-tenancy setting.
- Experience with Entra ID and M365 security protocols, compliance, and data governance best practices.
- Experience with Microsoft Purview, especially Data Loss Prevention.
- Having previously worked with Microsoft's engineering teams on multi-tenancy feature development would be a big plus.
- Familiarity with automation tools such as Power Automate to implement any necessary approval flows for SharePoint or Teams, ensuring our group's existing information-sharing policies are followed.
- Familiarity with Entra ID and M365 licensing models and requirements for MTO.

Primary Skills: Microsoft 365, Entra ID, Power Automate, MTO
Secondary Skills: SharePoint, Microsoft Purview
Posted 1 week ago
12.0 - 14.0 years
15 - 25 Lacs
Vapi
Work from Office
Role & responsibilities Looking out for candidates from Textile Industry The MDM Senior Team Lead working from GCC is expected to work closely with the Business Data Owners, Business IT and Group IT Organizations to operate and improve upon the "Enterprise Master Data Management" as per the SOP and provide it as a service back to all BUs in the group. This position will support operational activities such as Creation, Modification, Extension etc of Master data. Also, tasks such as monitoring and analysing of master data, identifying gaps in the key data elements and taking actions proactively to have the data corrected, completed on concurrency and consistent basis are expected. Responsibility* Master Data Life Cycle: • To ensure Master data creation / modification requests are performed through the designated systems in a controlled manner adhering to the established SOPs. This includes Masters related to Finished / Semi Finished Goods, Purchase Materials, Equipment & Task Lists, Vendors, Customers, Services and Chart of Accounts. • To manage a team of Data stewards who perform the MDM tasks as per the established SOPs • To assist in the application and implementation procedures of data standards and guideline on data ownership, data structures, and data replication to ensure access to and integrity of master data sets. • To partner with business units leadership and process experts to resolve master data issues when they arise. • To assist entities in data migration activities, UATs and system checks during ERP upgrades like S4/HANA implementations. Master Data Analysis: • To ensure master data integrity in key systems as well as maintaining the processes to support the data quality. • To identify areas for data quality improvements and help to resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies. • To ensure quality of master data in key systems, as well as, development and documentation of processes with other functional data owners to support ongoing maintenance and data integrity. • In collaboration with subject matter experts and data stewards, define and implement data strategy, policies, controls, and programs to ensure the enterprise data is accurate, complete, secure, and reliable. Master Data Governance : • To manage GCCS in-house Master Data Governance Tool as a front end interface between users and ERP to ensure system imposed controls and monitoring in MDM tas. • To manage and monitor BI Dashboard to keep track of KPIs and SLAs • To develop and implement strategies to translate business requirements and models into feasible and acceptable data designs to ensure that business needs are met. • To support the business with required procedures, submit incidents and change requests when needed Note: Please refer to the picklist master; the values should be taken from that picklist. 1. Top Management 2. Mid Management 3. Cross functional Collaborations 4. Client Relations 5. Financial Auditing
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad, Ahmedabad
Work from Office
Grade Level (for internal use): 08

Job Title: Lead Data Analyst
Functional Domain: Enterprise Data Organization

Job Overview:
The Lead Data Analyst will play a critical role in the CCST team, working in partnership with the Quality Assurance team. The role involves managing client inquiries, analyzing banking datasets, and leveraging advanced data manipulation skills. This position requires proficiency in SQL; experience with Python, Salesforce, and Power BI is preferred. The successful candidate will thrive in a global team environment, demonstrating a strong commitment to data integrity and client satisfaction.

Key Responsibilities:
- Client Management & Inquiry Resolution: Act as the primary contact for managing and resolving client inquiries, ensuring exceptional service and satisfaction.
- Data Analysis & Insights: Analyse client questions to uncover actionable insights that support strategic business decisions.
- Quality Assurance Participation: Participate in the execution of sampling workflows and quality assurance initiatives to enhance data accuracy and reliability.
- Data Visualization: Design and develop intuitive Power BI dashboards to facilitate effective data visualization for stakeholders.
- Advanced Data Manipulation: Employ advanced SQL and Python skills for comprehensive data manipulation and analysis, driving product enhancement.
- Cross-Functional Collaboration: Collaborate with cross-functional teams to ensure data initiatives align with organizational objectives, fostering a global mindset.
- Shift Adaptability: Adapt to varying shifts as required, ensuring consistent and reliable support across diverse time zones.

What's In It for You:
- Collaborative Environment: Work closely with client-facing teams to enhance service delivery and client satisfaction.
- Quality Focus: Engage with the Quality Assurance team to ensure data accuracy and integrity across all analyses.
- Career Growth: Opportunity to progress within the organization by leveraging strong technical knowledge and expertise.
- Skill Development: Enhance your skills in data analysis, visualization, and client management through continuous learning and development.
- Global Exposure: Be part of a diverse team that operates across different time zones, promoting a global mindset and cultural understanding.

Basic Required Qualifications:
- Bachelor's degree in Data Science, Computer Science, Finance, or a related field.
- Proven experience working with banking datasets and managing client inquiries.
- Proficiency in SQL for data analysis and manipulation.
- Ability to work varying shifts to accommodate global business needs.

Key Soft Skills:
- Strong attention to detail and analytical thinking.
- Excellent communication skills, both written and verbal.
- Ability to work collaboratively in a team-oriented environment.
- Flexibility and adaptability to changing work conditions and priorities.

Additional Preferred Qualifications:
- Experience with Python for advanced data analysis and automation.
- Familiarity with Salesforce for client relationship management.
- Proficiency in Power BI for creating impactful data visualizations.
- Demonstrated ability to quickly learn and apply new technologies.
- Previous experience in a client-facing role within the financial sector.
Posted 1 week ago
5.0 - 10.0 years
8 - 10 Lacs
Pune
Work from Office
ASIA IT: MDM & Data Governance Specialist We are looking for a MDM & Data Governance Specialist based in India to join our young dynamic ASIA IT Team. You will be responsible to implement/manage Master-Data-Management solutions and work actively with team(s) to implement global data governance structure/policy and our Global BI system(s). About You Were looking for someone who loves to learn, share knowledge, and is a team player someone who can bring a fresh perspective and one who is constantly seeking better practice for the overall quality of our work and our customer experience. *************************************************************************** Key Areas of Responsibility Develop & manage Master Data Management solution. Development of Global BI system(s) Lead MDM and Data Governance support Participate in Global Project(s) Gather feedback from business users and understand the requirements Define/Validate IT data process and Data Governance Work with other Regional/Global IT teams/Vendor Assists in project planning and scheduling Vendor Coordination & Vendor-Contract Management Qualifications / Experience Diploma/relevant qualification in IT or Computer Science 5+ Years experience in MDM, BI space, Data Governance. Exposure to end-user environments with different application technologies (ERP, CRM, ) Knowledge and exposure to Cloud/SaaS environments Fluent in written and spoken English Team player with experience working with multi-cultural team / MNC Pro-active and independent Good knowledge of Business process (Preferably in Manufacturing domain) knowledge and exposure working with different modes of integrations ( ETL, ELT, ESB.) Ability to prioritize assignments. Excellent communication skills, both written and verbal, with the ability to present complex ideas to technical and non-technical audiences. Excellent analytical and problem-solving skills with an understanding of existing and emerging technologies. Ability to prioritize workload and work well under pressure to meet deadlines and manage business expectations. Regional travel will be required Experience working in the manufacturing domain is a plus Willingness to learn new applications is a plus
Posted 1 week ago
7.0 - 12.0 years
20 - 35 Lacs
Noida, Chennai
Hybrid
Deployment, configuration & maintenance of Databricks clusters & workspaces Security & Access Control Automate administrative task using tools like Python, PowerShell &Terraform Integrations with Azure Data Lake, Key Vault & implement CI/CD pipelines Required Candidate profile Azure, AWS, or GCP; Azure experience is preferred Strong skills in Python, PySpark, PowerShell & SQL Experience with Terraform ETL processes, data pipeline &big data technologies Security & Compliance
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Chennai, Bengaluru
Work from Office
Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering.
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
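A minimal, hedged PySpark sketch of the kind of ETL step described above: reading raw CSV data from ADLS Gen2, applying a simple quality filter, and writing curated Delta output. The storage paths and column names are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch (illustrative): read raw CSVs from ADLS Gen2,
# apply a basic quality filter, and write curated Delta output.
# Container paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/orders/"          # placeholder
curated_path = "abfss://curated@mydatalake.dfs.core.windows.net/orders/"  # placeholder

orders = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
)

# Basic data-quality rule: drop rows missing a key or with non-positive amounts.
clean = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))

(
    clean.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path)
)
```

In an ADF-orchestrated setup, a notebook or job containing a step like this would typically be parameterized with the input/output paths and triggered per schedule or per file arrival.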
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Senior Data Engineer (Remote, Contract, 6 Months): Databricks, ADF, and PySpark

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering.
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Contract Details:
- Role: Senior Data Engineer
- Mode: Remote
- Duration: 6 Months
- Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune
Posted 1 week ago
12.0 - 20.0 years
35 - 50 Lacs
Bengaluru
Hybrid
Data Architect with Cloud Expertise: Data Architecture, Data Integration & Data Engineering. ETL/ELT: Talend, Informatica, Apache NiFi. Big Data: Hadoop, Spark. Cloud platforms (AWS, Azure, GCP), Redshift, BigQuery. Python, SQL, Scala. GDPR, CCPA.
Posted 1 week ago
2.0 - 7.0 years
20 - 25 Lacs
Chennai
Work from Office
The Data Engineering Manager needs to be well versed in the Microsoft Business Intelligence stack, with strong skills and experience in the development and implementation of BI and advanced analytics solutions as per business requirements.

- Strong hands-on experience in Microsoft ADF pipelines, Databricks notebooks, and PySpark.
- Adept in the design and development of data flows using ADF.
- Expertise in implementing complex ETL logic through Databricks notebooks.
- Experience in implementing CI/CD pipelines through Azure DevOps.
- Experience in writing complex T-SQL.
- Understand the business requirements and develop data models accordingly.
- Knowledge and experience in prototyping, designing, and requirement analysis.
- Excellent knowledge of data usage, scheduling, data refresh, and diagnostics.
- Experience in tools such as Microsoft Azure, SQL Data Warehouse, Visual Studio, etc.
- Worked in an agile (Scrum) environment with globally distributed teams.
- Analytical bent of mind.
- Business acumen and articulation skills; ability to capture business needs and translate them into a solution.
- Ability to manage interaction with business stakeholders and others within the organization.
- Good communication and documentation skills.
- Proven experience in interfacing with different source systems.
- Proven experience in data modelling.

Minimum required Education: BE Computer Science / MCA / MSc IT
Minimum required Experience: Minimum 2 years of experience in Data Engineering or equivalent with a Bachelor's Degree.
Preferred Certification: Azure ADF / Databricks / T-SQL
Preferred Skills: Azure ADF/Databricks, PySpark/T-SQL, Data Governance, Data Harmonization & Processing, Data Quality Assurance, Business Intelligence Tools, Requirements Analysis, Root Cause Analysis (RCA), Requirements Gathering
Posted 1 week ago
2.0 - 6.0 years
5 - 8 Lacs
Kochi
Work from Office
Job Summary:
We are seeking a highly skilled and motivated Machine Learning Engineer with a strong foundation in programming and machine learning, hands-on experience with AWS Machine Learning services (especially SageMaker), and a solid understanding of Data Engineering and MLOps practices. You will be responsible for designing, developing, deploying, and maintaining scalable ML solutions in a cloud-native environment.

Key Responsibilities:
- Design and implement machine learning models and pipelines using AWS SageMaker and related services.
- Develop and maintain robust data pipelines for training and inference workflows.
- Collaborate with data scientists, engineers, and product teams to translate business requirements into ML solutions.
- Implement MLOps best practices including CI/CD for ML, model versioning, monitoring, and retraining strategies.
- Optimize model performance and ensure scalability and reliability in production environments.
- Monitor deployed models for drift, performance degradation, and anomalies.
- Document processes, architectures, and workflows for reproducibility and compliance.

Required Skills & Qualifications:
- Strong programming skills in Python and familiarity with ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Solid understanding of machine learning algorithms, model evaluation, and tuning.
- Hands-on experience with AWS ML services, especially SageMaker, S3, Lambda, Step Functions, and CloudWatch.
- Experience with data engineering tools (e.g., Apache Airflow, Spark, Glue) and workflow orchestration.
- Proficiency in MLOps tools and practices (e.g., MLflow, Kubeflow, CI/CD pipelines, Docker, Kubernetes).
- Familiarity with monitoring tools and logging frameworks for ML systems.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- AWS Certification (e.g., AWS Certified Machine Learning - Specialty).
- Experience with real-time inference and streaming data.
- Knowledge of data governance, security, and compliance in ML systems.
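For illustration only, here is a minimal sketch of training and deploying a scikit-learn model with the SageMaker Python SDK, the kind of pipeline step this role describes. The IAM role ARN, S3 URI, entry-point script, and framework version are assumptions, not values from the posting.

```python
# Illustrative SageMaker training/deployment sketch using the SageMaker
# Python SDK. Role ARN, S3 URI, entry-point script, and framework version
# are placeholders/assumptions.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

estimator = SKLearn(
    entry_point="train.py",          # hypothetical training script
    role=role,
    instance_type="ml.m5.large",
    instance_count=1,
    framework_version="1.2-1",       # assumed available sklearn container version
    sagemaker_session=session,
)

# Train against data previously staged in S3 (placeholder URI).
estimator.fit({"train": "s3://my-bucket/ml/train/"})

# Deploy a real-time endpoint and run one prediction.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
print(predictor.predict([[0.1, 0.2, 0.3, 0.4]]))

# Clean up the endpoint when done to avoid ongoing cost.
predictor.delete_endpoint()
```

In an MLOps setup this sequence would normally run inside a pipeline (SageMaker Pipelines, Step Functions, or CI/CD) rather than interactively, with model versions registered and endpoints monitored for drift.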
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Job Title: Data Governance & Management Associate Role Description The Compliance and Anti-Financial Crime (CAFC) Data Office is responsible for Data Governance and Management across key functions including AFC, Compliance, and Legal. The team supports these functions in establishing and improving data governance to achieve critical business outcomes such as effective control operation, regulatory compliance, and operational efficiency. The CAFC Data Governance and Management team implements Deutsche Banks Enterprise Data Management Frameworkfocusing on controls, culture, and capabilitiesto drive improved data quality, reduce audit and regulatory findings, and strengthen controls. As a member of the Divisional Data Office, the role holder will support both Run-the-Bank and Change-the-Bank initiatives, with a particular focus on Financial Crime Risk Assessment (FCRA) data collation, processing, testing, and automation. Your key responsibilities Document and maintain existing and new processes; respond to internal and external audit queries and communicate updates clearly to both technical and non-technical audiences. Independently manage the FCRA data collection process across multiple metrics, including template generation, data collection, quality checks, and stakeholder escalation. Perform variance analysis and develop a deep understanding of underlying data sources used in Financial Crime Risk Assessment. Collaborate with TDI on new releases and ensure new data sources align with Deutsche Banks Data Governance standards. Maintain metadata in Collibra, visualize data lineage in Solidatus, and ensure certification and control coverage. Automate manual data processes using tools such as Python, SQL, and Power Query to improve efficiency and reduce operational risk. Translate complex technical issues into simple, actionable insights for business stakeholders, demonstrating strong communication and stakeholder management skills. Your skills and experience 6+ years of experience in data management within financial services, with a strong understanding of data risks and controls. Familiarity with industry-standard frameworks such as DCAM or DAMA (certification preferred). Hands-on experience with Data cataloguing using Collibra, Data lineage documentation using Solidatus and Data control assessment and monitoring Proficiency in Python, SQL, and Power Query for data analysis and automation. Strong communication skills with the ability to explain technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively across global teams.
Posted 1 week ago
8.0 - 13.0 years
5 - 9 Lacs
Chennai
Work from Office
Role Purpose: The purpose of this role is to execute the process and drive the performance of the team on the key metrics of the process.

Job Details:
Country/Region: India | Employment Type: Remote | Work Type: Contract | State: Tamil Nadu | City: Chennai

Requirements:
We need three consultants with a mix of CFIN and RTR knowledge; candidates with only CFIN or only RTR will also be considered. We are looking for an offshore resource working 7 AM to 4 PM ET. Please find below the job description for the CFIN and RTR workstreams.

Minimum requirements:
Education: Bachelor's degree, foreign educational equivalent, or equivalent based on a combination of education and/or experience.

Required Experience (in addition to a degree or equivalent, must have all of the following):
- Minimum 8 years of SAP experience in implementing and supporting FICO, GL, AR, AP, AA, CO-PA, Lockbox processes, Cash Applications, and FSCM, and integration with other process areas: Sales & Distribution (SD), Materials Management (MM), Production Planning (PP), Business Warehouse (BW).
- Minimum 5 years of SAP functional experience in a Fortune 1000 or similar enterprise, including requirements gathering, design, and customization.
- Minimum 2 years of hands-on experience in SAP Financials & Controlling (FICO) configuration in an SAP S/4HANA system following the SAP Activate methodology.
- Minimum 2 full lifecycle implementations in SAP S/4HANA specific to the SAP Financials and Controlling area.
- Minimum 2 years of experience with SAP S/4HANA Central Finance (CFIN) reporting, Central Payments, Central Tax Reporting, Central Asset Accounting, Central Collections Management, and Central Dispute Management.
- Minimum 2 years of experience with the SAP S/4HANA Record to Report (RTR) workstream, specifically Accounts Receivable, Accounts Payable, Cash Application, Asset Accounting, Collections Management, Dispute Management, and bank integrations.
- Minimum 1 year of experience with SAP Solution Manager, including the SAP ChaRM process.
- Minimum 1 year of experience communicating technical and business issues/solutions to all levels of management.

Preferred Experience:
- SAP Business Planning and Consolidation (BPC).
- Integration of the SAP solution with the Vertex external tax engine.
- Paymetric credit card processing application.
- Strong experience in SAP BCM (Bank Communication Management).
- OpenText Vendor Invoice Management (VIM) for both PO-based and non-PO-based vendor invoice scenarios.
- SAP Master Data Governance (MDG).
- Experience in complex SAP environments supporting multiple SAP components is preferred.
- SAP certification(s) in the SAP FICO functional area.
Posted 1 week ago
5.0 - 10.0 years
2 - 6 Lacs
Pune
Work from Office
Senior Associate, Business Process Management Analyst

At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world's financial system, we touch nearly 20% of the world's investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities, and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what #LifeAtBNY is all about.

We're seeking a future team member for the role of Senior Associate, Business Process Management Analyst to join our Wealth Management Data Governance team. This role is located in Pune, MH (hybrid).

In this role, you'll make an impact in the following ways:
- This is a unique opportunity to join the Data Governance team and be a pivotal part of Wealth Management's data transformation. The team was formed to establish robust data management and governance practices and to develop a data hub that makes information accessible and actionable for our front-line business partners.
- The individual in this role will have the opportunity to learn industry best practices for data management and data quality, refine their data wrangling skills, learn more about Wealth Management, and serve up data-driven insights to our business partners.
- One of the team's primary objectives is to implement a collaborative data platform for a more streamlined way of moving, transforming, analyzing, and communicating information. This will require the analyst to build relationships with key stakeholders, work with internal clients to understand their data needs, and partner with IT to deliver data solutions.
- Key tools for the team are Collibra, CDQ, and Dataiku, which provide most of the functions needed to perform the role (connections to databases, IDE functionality, SQL functions, Jupyter notebooks). SQL software is needed to connect to certain databases.
- Deliver value, and deliver value frequently.
- Build and maintain relationships with key stakeholders in WM businesses and IT.
- Translate complex technical concepts and analyses for non-technical audiences.
- Prepare ad-hoc reports at the request of managers and/or other leaders.
- Build BI and machine learning prototypes and derive actionable insights for the businesses.

To be successful in this role, we're seeking the following:
- B.Tech/BE/BS degree (stats, math, and engineering degrees are a plus).
- 5+ years of experience working in Data Quality and Data Management.
- 3+ years of experience with Collibra and CDQ.
- Excellent interpersonal and client-facing skills.
- 2+ years of experience with SQL.
- Good knowledge of Snowflake.
- Passion for helping others succeed and for learning new skills; self-starter.
- Dataiku experience is good to have.
- Experience in the financial industry is preferred.
- Good knowledge of Excel.
- Agile experience is a plus.

At BNY, our culture speaks for itself.
Here are a few of our awards:
- America's Most Innovative Companies, Fortune, 2024
- World's Most Admired Companies, Fortune, 2024
- Human Rights Campaign Foundation, Corporate Equality Index, 100% score, 2023-2024
- Best Places to Work for Disability Inclusion, Disability:IN, 100% score, 2023-2024
- Most Just Companies, Just Capital and CNBC, 2024
- Dow Jones Sustainability Indices, top-performing company for sustainability, 2024
- Bloomberg's Gender Equality Index (GEI), 2023

Our Benefits and Rewards:
BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy. We provide access to flexible global resources and tools for your life's journey. Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leaves, including paid volunteer time, that can support you and your family through moments that matter.

BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.
Posted 1 week ago
5.0 - 10.0 years
6 - 10 Lacs
Mumbai
Work from Office
Google Workspace Administrator - Procain Consulting & Services

Job Summary:
We are seeking an experienced Google Workspace Administrator to manage, configure, and maintain our Google Workspace environment. The ideal candidate will ensure smooth operations, optimize collaboration tools, provide end-user support, and maintain security compliance. This is a rotational-shift role based out of our Mumbai, Parel office, with 5 working days and 2 days off per week.

Key Responsibilities:
- Manage and administer Google Workspace (G Suite), including Gmail, Drive, Calendar, Meet, the Admin Console, and other related services.
- Provision and de-provision user accounts, groups, and access permissions following company policies.
- Monitor system performance, troubleshoot issues, and coordinate with Google support when necessary.
- Implement and enforce security policies, including multi-factor authentication, data loss prevention, and compliance controls.
- Provide technical support and training to end users for Google Workspace applications.
- Manage and maintain integrations with third-party tools and applications connected to Google Workspace.
- Plan and execute migration projects or updates related to Google Workspace.
- Maintain documentation on configurations, procedures, and best practices.
- Collaborate with IT teams to ensure seamless communication and infrastructure support.

Requirements:
- Minimum 5 years of hands-on experience managing Google Workspace environments in a corporate setting.
- Strong knowledge of the Google Workspace Admin Console, user lifecycle management, and security controls.
- Experience with Google Workspace APIs, scripting (e.g., Google Apps Script), and automation is a plus.
- Familiarity with cloud security best practices, identity management, and data governance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication skills with the ability to train and support end users effectively.
- Ability to work rotational shifts and onsite at the Mumbai, Parel office.

Preferred Skills:
- Google Workspace Administrator certification or equivalent.
- Experience with other cloud platforms (Microsoft 365, AWS, etc.) is an advantage.
- Knowledge of ITIL or other IT service management frameworks.
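As a hedged illustration of the Google Workspace API automation mentioned above, the sketch below lists users via the Admin SDK Directory API and flags accounts not enrolled in 2-Step Verification. The service-account key file and delegated admin address are placeholders.

```python
# Illustrative Google Workspace admin sketch: list users via the Admin SDK
# Directory API and flag accounts not enrolled in 2-Step Verification.
# The service-account key file and admin email are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user.readonly"]

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
).with_subject("admin@example.com")  # delegated admin (placeholder)

directory = build("admin", "directory_v1", credentials=creds)

resp = directory.users().list(
    customer="my_customer", maxResults=100, orderBy="email"
).execute()

for user in resp.get("users", []):
    email = user.get("primaryEmail")
    if not user.get("isEnrolledIn2Sv", False):
        print(f"{email}: NOT enrolled in 2-Step Verification")
```

Equivalent checks can be built in Google Apps Script or surfaced through the Admin Console's security reports; a scripted version is useful when the results feed a recurring compliance report.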
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Hyderabad, Gurugram, Bengaluru
Hybrid
Locations: Multiple locations (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)

Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM and Tuning Advisor, and in analyzing AWR reports.
- Strong SQL skills, with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC.
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.

Skills: Snowflake database design, Oracle, performance tuning, OEM, Tuning Advisor, AWR report analysis, SQL, Apps DBA, compute and storage management, data governance, column-level security, dynamic masking, RBAC, security policies, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, secure data sharing, coding, documentation.
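To illustrate the column-level security and RBAC items above, here is a minimal sketch using the Snowflake Python connector to create a dynamic masking policy and grant role-based read access. The connection parameters, table, column, and role names are all hypothetical.

```python
# Illustrative Snowflake governance sketch: create a masking policy for an
# email column and grant role-based read access. Connection parameters,
# table, column, and role names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ADMIN_WH",      # placeholder
    database="SALES_DB",       # placeholder
    schema="PUBLIC",
)

statements = [
    # Column-level security: mask email for everyone except a privileged role.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # RBAC: read-only access for an analyst role.
    "GRANT USAGE ON DATABASE SALES_DB TO ROLE ANALYST",
    "GRANT USAGE ON SCHEMA SALES_DB.PUBLIC TO ROLE ANALYST",
    "GRANT SELECT ON TABLE SALES_DB.PUBLIC.CUSTOMERS TO ROLE ANALYST",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```

The same statements can of course be run directly in a worksheet; wrapping them in a script is what makes account-level governance repeatable across the multiple Snowflake accounts the role mentions.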
Posted 1 week ago
13.0 - 18.0 years
30 - 37 Lacs
Gurugram
Work from Office
Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission | OpenBlue: This is How a Space Comes Alive

How will you do it?
- Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices.
- Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes.
- Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives.
- Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance.
- Optimization: Monitor and optimize the performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying.
- Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows.
- Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures.
- Training and Support: Provide training and support to teams on Snowflake usage and best practices.
- Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity.
- Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies.
- Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases.

What we look for:
- Minimum Bachelor's / Postgraduate / Master's degree in any stream.
- Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or similar role.
- Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling.
- Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake.
- Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows.
- Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets.
- SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool.
- Excellent communication and collaboration skills.
- Strong problem-solving abilities and analytical mindset.
- Ability to work in a fast-paced, dynamic environment.

Diversity & Inclusion:
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
Posted 1 week ago
3.0 - 6.0 years
11 - 16 Lacs
Gurugram
Work from Office
KPMG India is looking for an Assistant Manager - Data Governance to join our dynamic team and embark on a rewarding career journey:
- Ensuring company policies are followed
- Optimizing profits by controlling costs
- Hiring, training, and developing new employees
- Resolving customer issues to their overall satisfaction
- Maintaining an overall management style that follows company best practices
- Providing leadership and direction to all employees
- Ensuring product quality and availability
- Preparing and presenting employee reviews
- Working closely with the store manager to lead staff
- Overseeing retail inventory
- Assisting customers whenever necessary
- Organizing employee schedules
- Ensuring that health, safety, and security rules are followed
- Ensuring a consistent standard of customer service
- Motivating employees and ensuring a focus on the mission
- Maintaining merchandise and a visual plan
- Maintaining stores to standards, including stocking and cleaning
- Completing tasks assigned by the general manager accurately and efficiently
- Supporting the store manager as needed
Posted 1 week ago
7.0 - 11.0 years
14 - 18 Lacs
Bengaluru
Work from Office
D365 F&O SCM Enterprise Asset Management (EAM) Consultant End to End implementation and support of D365FO/AX2012. Knowledge of the AOT, objects and structures. Experience of D365 FO Enterprise Asset management module implementation & Support. Knowledge of following modules - Product information management, Procurement and sourcing, Inventory Management, Sales and marketing, Quality management, Advance warehouse or warehouse management, Transport management, Manage the development and launch of new features and functionality through the entire product development lifecycle, from concept through implementation, and release Requirements gathering, configuration, solution design and test the build products. Prepare Git/Gap analysis on D365FO/AX2012 product feature. Collaborate with change management / training team to develop relevant content. Collaborate with product manager and parallel product teams to define future roadmap capability needs and prioritize roadmap capabilities with a value driven approach. Partner closely with the product manager and with the engineering team to develop the creation of the product through a well-structured Agile Scrum model and set of ceremonies (Daily Standup, Backlog Grooming, Sprint Planning, Demos, Retrospectives) where candidate is a leader, not just a participant. Manage the development and launch of new features and functionality through the entire product development lifecycle, from concept through implementation and release. Manage and execute requests from the business to add/update/delete maintenance master data Conduct quality checks and ensure correct data handling and data governance Provide prompt support to business users in responding to requests and to troubleshoot data issues Support global business projects in a data advisory and mass change execution perspective Support business partners in the adherence to rules; provide guidance and support to business on data reconciliation/clean-up Act as primary data advisor to functional business lines (includes day to day and formal guidance and training)
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Gurugram
Remote
Responsibilities:
- Automate evidence collection and control testing processes using scripting tools (e.g., Python, PowerShell).
- Develop scripts and queries to analyze datasets, identify anomalies, and validate data integrity.
- Build and maintain automation workflows for ITGC control reviews and audit documentation.
- Parse structured and unstructured data sources to extract relevant compliance information.
- Coordinate and track control review activities using workflow tools (e.g., Jira, ServiceNow).
- Support testing and documentation of controls related to logical access, change management, and computer operations.

Qualifications:
- Proficiency in Python, PowerShell, and Power Automate for scripting, automation, and data parsing.
- Strong experience with SQL for querying and validating data across systems.
- Familiarity with Azure DevOps, GitHub, or similar platforms for version control and CI/CD is a plus.
- Advanced Excel skills for data manipulation and reporting.
- Strong automation mindset with a passion for improving manual processes.
- Analytical and detail-oriented with a focus on data accuracy and integrity.
- Clear communicator who can explain technical logic and compliance rationale.
- Coachable, collaborative, and eager to learn new tools and frameworks.

Technical Skills:
- Strong technical scripting skills (SQL, Python, PowerShell).
- Proficiency in Excel.
- Ability to reduce manual touchpoints through automation.
- Strong analytical/process-improvement mindset (engineer or analyst profile).
- Focused on capturing evidence, proving scripts work, and bringing high attention to detail.
- No prior GRC experience required; the priority is technical capability.
- Exposure to Azure/Azure DevOps is a plus.
- Familiarity with Saviynt and working across connected/disconnected systems.
- The team is moving toward GitHub; the current FTE will be focused on that migration, and this role will support that lift.
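A hedged sketch of the kind of logical-access control test that could be automated in Python: cross-checking a system access export against an HR termination report and saving the exceptions as timestamped audit evidence. The file and column names are hypothetical placeholders.

```python
# Illustrative ITGC logical-access test: flag accounts that remain active
# after the user's termination date, and save the exceptions with a
# timestamp as audit evidence. File and column names are hypothetical.
from datetime import datetime
import pandas as pd

access = pd.read_csv("system_access_export.csv", parse_dates=["last_login"])     # user_id, status, last_login
hr = pd.read_csv("hr_termination_report.csv", parse_dates=["termination_date"])  # user_id, termination_date

joined = access.merge(hr, on="user_id", how="left")

# Control expectation: terminated users must not hold active access.
exceptions = joined[
    joined["termination_date"].notna() & (joined["status"].str.upper() == "ACTIVE")
]

run_stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
evidence_file = f"evidence_terminated_active_access_{run_stamp}.csv"
exceptions.to_csv(evidence_file, index=False)

print(f"{len(exceptions)} exception(s) written to {evidence_file}")
```

Writing the exception file with a run timestamp is one simple way to capture repeatable, reviewable evidence that the scripted control actually executed.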
Posted 1 week ago
1.0 - 2.0 years
11 - 15 Lacs
Hyderabad
Work from Office
About the role:
We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 12 years of experience in software and data engineering and analytics, and a proven track record of designing and implementing complex data solutions. You will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects.

What you'll be doing:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction, both broadly for the AI Center of Excellence and collaboratively beyond it.
- Design and develop breakthrough products, services, or technological advancements in the Data Intelligence space that expand our business.
- Work alongside product management to craft technical solutions that solve customer business problems.
- Own technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance.
- Continuously challenge the status quo of how things have been done in the past.
- Build a data access strategy to securely democratize data and enable research, modelling, machine learning, and artificial intelligence work.
- Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice.
- Work in a cross-functional team to translate business needs into data architecture solutions.
- Ensure data solutions are built for performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Keep current on technology: distributed computing, big data concepts, and architecture.
- Promote internally how data within Blackbaud can help change the world.

What we want you to have:
- 10+ years of experience in data and advanced analytics.
- At least 8 years of experience working on data technologies in Azure/AWS.
- Experience building modern products and infrastructure.
- Experience working with .NET/Java and microservice architecture.
- Expertise in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Expertise in Databricks and Microsoft Fabric.
- Strong understanding of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership skills.
- Able to work flexible hours as required by business priorities.
- Ability to deliver software that meets consistent standards of quality, security, and operability.

Stay up to date on everything Blackbaud: follow us on LinkedIn, X, Instagram, Facebook, and YouTube.

Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status, or any other basis protected by federal, state, or local law.
Posted 1 week ago