
946 Metadata Jobs - Page 4

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

9.0 - 12.0 years

20 - 30 Lacs

Gurugram

Work from Office

Tableau Lead/Architect: Will be responsible for designing, developing, and implementing advanced Tableau-based data visualization solutions that provide actionable insights to business stakeholders. The role requires a deep understanding of Tableau architecture, data integration, and performance optimization, a solid background in working with large datasets, and a strong understanding of business intelligence concepts, data governance, security, and user management in Tableau environments, reporting, and analytics. Work closely with cross-functional teams, including business analysts, data engineers, and IT teams, to deliver scalable, high-quality business intelligence solutions. Tableau Architecture & Design: Lead the design and architecture of Tableau environments, including data sources, workbooks, dashboards, and report structures. Ensure the scalability, performance, and security of Tableau deployments across multiple environments (development, staging, production). Define and implement best practices for Tableau design, dashboard development, and data visualization to meet business requirements. Collaborate with data engineers and IT teams to design data pipelines and integrate Tableau with various data sources (e.g., SQL Server, AWS Redshift, Google BigQuery, cloud data lakes, APIs). Expertise in Tableau Server, Tableau Desktop, and Tableau Prep. Experience integrating Tableau with various data sources, including relational databases, cloud platforms (e.g., AWS, Azure), and third-party APIs. Knowledge of scripting languages such as Python, R, or JavaScript for advanced data analytics and dashboard customization.

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

Job Description for Senior NLP Research Engineer. Job Title: Senior NLP Research Engineer. Location: Noida. Qualification: BTech in CS/AI/Data Science or a related discipline. Experience: 4-6 years. About us: Metafusion is a groundbreaking analytics platform that harnesses the formidable trio of big data analytics, computer vision, and NLP to unlock the boundless potential within metadata. Through its sophisticated AI layer, Metafusion seamlessly extracts actionable insights, empowering users to effortlessly traverse vast metadata repositories. Its intuitive search engine adeptly interprets natural language queries, while the integration of computer vision technology elevates analysis by providing insights from visual data. Metafusion not only redefines the landscape of data analytics but also empowers organizations to make informed decisions and propel innovation across diverse industries. Roles and Responsibilities: We're looking for a talented and driven Senior NLP Research Engineer to join our AI/ML team. In this role, you'll design and implement advanced natural language processing models that interpret metadata and user queries across complex video datasets. You'll also contribute to the development of Generative AI-based assistants, integrated into our video management ecosystem, enhancing user experience through natural interaction and smart search. Design, develop, train, fine-tune, and deploy LLM-based NLP models for various enterprise use cases. Experience in designing networks from scratch according to a given problem statement. Experience with training methodologies for handling large datasets. Collaborate with product, engineering, and data teams to integrate NLP solutions into production-grade systems. Build and optimize custom pipelines for document intelligence, question answering, summarization, and conversational AI. Experience and understanding of state-of-the-art open-source LLMs (e.g., GPT, LLaMA, Claude, Falcon, etc.). Implement prompt engineering, retrieval-augmented generation (RAG), and fine-tuning strategies. Ensure ethical, safe, and scalable deployment of LLM-based systems. Ensure model reliability, safety, performance, and compliance using evaluation and monitoring frameworks. Stay updated with the latest research in GenAI and NLP, and apply relevant findings to improve current systems. Job Requirements: 4-6 years of experience in NLP, Machine Learning, or Deep Learning roles. Proven experience working with VLMs and LLMs in real-world projects. Proficiency with tools such as Hugging Face, LangChain, llama.cpp, and TensorFlow/PyTorch. Hands-on experience with vector databases (e.g., FAISS, Pinecone, Weaviate) and RAG frameworks (not required but appreciated). Strong programming skills in Python and some experience with MLOps or cloud platforms (AWS/GCP/Azure). Deep understanding of NLP techniques, including basics such as NER, text classification, embeddings, tokenization, Transformers, loss functions, etc. Knowledge of distributed training and serving of large models. Strong problem-solving and communication skills. Our perfect candidate is someone who: Is proactive and an independent problem solver. Is a constant learner; we are a fast-growing company and we want you to grow with us! Is a team player and a good communicator.
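
Note: For readers unfamiliar with the retrieval-augmented generation (RAG) and vector database stack this posting mentions, here is a minimal sketch of the retrieval step using FAISS and Sentence-Transformers. The model name, documents, and query are illustrative placeholders, not part of the listing.

```python
import numpy as np
import faiss  # vector similarity search
from sentence_transformers import SentenceTransformer

# Placeholder corpus standing in for metadata/event descriptions.
docs = [
    "Camera 12 went offline at the loading dock on Friday evening.",
    "Motion detected near the north gate after business hours.",
    "Firmware updated on all parking-lot cameras last week.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")          # small general-purpose embedder
emb = model.encode(docs, normalize_embeddings=True)      # unit-norm vectors -> inner product = cosine

index = faiss.IndexFlatIP(emb.shape[1])                  # exact inner-product index
index.add(np.asarray(emb, dtype="float32"))

query = "which cameras had connectivity problems?"
q_emb = model.encode([query], normalize_embeddings=True)
scores, ids = index.search(np.asarray(q_emb, dtype="float32"), k=2)

for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {docs[i]}")
```

The retrieved passages would then be placed into the LLM prompt for generation; that step, and any LangChain or llama.cpp orchestration, is omitted here.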

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Why we need this role: We are looking for a detail-oriented and strategically minded Senior Analyst to join our Segmentation & Customer Onboarding team. This role is pivotal in driving analytical insights that shape customer acquisition and onboarding strategies, leveraging tools like Salesforce, Excel, and CRM platforms. What you will do: Design and implement customer segmentation frameworks to support targeted onboarding strategies. Optimize and manage workflows using the Salesforce New Account Model for seamless onboarding. Analyze performance data to recommend process improvements and enhance customer experience. Collaborate with cross-functional teams to ensure accurate and efficient CRM data handling. Generate reports, dashboards, and actionable insights using advanced Excel skills. Utilize Dun & Bradstreet (D&B) datasets to enrich customer profiles and drive smarter segmentation. Ensure data integrity by maintaining data hygiene and conducting regular Salesforce audits. What we're looking for: Proven experience in customer segmentation and onboarding analytics. Proficiency in Salesforce, specifically the New Account Model. Strong command of Microsoft Excel, including pivot tables, advanced formulas, and data visualization. Solid understanding of CRM platforms and data architecture. Familiarity with D&B data services and their application in business analytics. Excellent communication skills and stakeholder management abilities. A knack for problem-solving and analytical thinking. 3-5 years of experience with any graduation degree. Competencies: Solving Complex Problems, Interacting with People at Different Levels, Prioritizing and Organizing Work, Serving Customers, Building and Supporting Teams, Driving for Results, Using Math Skills, Continuous Process Improvement, Data Management, Data Analysis, Research, Reports Development, Metadata Analysis, Sales Tools. Education: A bachelor's or master's degree in business administration, marketing, or a relevant field.

Posted 1 week ago

Apply

4.0 - 11.0 years

16 - 18 Lacs

Bengaluru

Work from Office

EPAM Anywhere is looking for a Business Analyst (Metadata) to join our dynamic team and embark on a rewarding career journey. Analyze the business requirements of the organization and develop solutions to improve business processes and systems. Conduct market research and data analysis to support decision-making. Collaborate with cross-functional teams, including development, product management, and project management, to ensure the delivery of high-quality solutions. Communicate findings and recommendations to stakeholders, including management and technical teams. Develop business requirements documents, use cases, process flows, and other deliverables as needed. Develop and maintain a deep understanding of the organization's products, services, and business operations. Participate in the implementation and testing of solutions to ensure that they meet business requirements. Continuously evaluate and improve business processes and systems. Strong analytical and problem-solving skills. Excellent written and verbal communication skills.

Posted 1 week ago

Apply

4.0 - 11.0 years

16 - 18 Lacs

Hyderabad

Work from Office

EPAM Anywhere is looking for a Business Analyst (Metadata) to join our dynamic team and embark on a rewarding career journey. Analyze the business requirements of the organization and develop solutions to improve business processes and systems. Conduct market research and data analysis to support decision-making. Collaborate with cross-functional teams, including development, product management, and project management, to ensure the delivery of high-quality solutions. Communicate findings and recommendations to stakeholders, including management and technical teams. Develop business requirements documents, use cases, process flows, and other deliverables as needed. Develop and maintain a deep understanding of the organization's products, services, and business operations. Participate in the implementation and testing of solutions to ensure that they meet business requirements. Continuously evaluate and improve business processes and systems. Strong analytical and problem-solving skills. Excellent written and verbal communication skills.

Posted 1 week ago

Apply

3.0 - 7.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Job Description. Role Overview: As a Strategic Sourcing Lead, you will support and manage end-to-end procurement operations, contract lifecycle management, vendor governance, and cost optimization initiatives. You will collaborate with global stakeholders to ensure efficient sourcing processes, compliance with internal policies, and delivery of commercial value. Key Responsibilities. Contract Management: Maintain and update the contracts repository with appropriate metadata and parent-child linkages. Validate and ensure contract execution; coordinate issue resolution across sourcing, risk, vendor, and business teams. Support contract transitions and tool migrations. Understand and review various contractual documents (e.g., MSAs, NDAs, work orders, amendments). Collaborate with legal and compliance teams to uphold business controls. Process Enablement & Tools: Assist teams in locating, uploading, and retrieving documentation across internal tools. Support ad-hoc contract administration projects and tool transitions. Utilize ERP systems (preferably NetSuite) to manage the full Procure-to-Pay (P2P) lifecycle: vendor onboarding, PO creation, invoice handling, and payment. Drive efficiency in purchase order/contract processing, data analysis, catalog, and vendor management. Strategic Sourcing & Vendor Management: Lead high-level sourcing efforts across top spend categories, including those with multi-site or global relevance. Negotiate cost structures to drive cost savings and risk mitigation. Contribute to the development of spend category strategies. Oversee vendor performance and relationship management to ensure value delivery and compliance. Analyze commercial terms to secure value for money and minimize risk exposure. Collaboration & Execution: Work with cross-functional teams (legal, finance, compliance, operations) across geographies. Manage multiple priorities and projects with attention to timelines and quality. Adapt to changing business needs and market dynamics with a proactive, process-driven approach. Qualifications: BA/BS degree. 5+ years of related experience. Experience with NetSuite & Coupa (or other ERP systems). Experience working with contract documents such as Statements of Work,

Posted 1 week ago

Apply

1.0 - 6.0 years

1 - 4 Lacs

Mumbai

Work from Office

Key Responsibility Areas: Video shoots of all Maruti Suzuki channel vehicles and other products. Editing videos as required and uploading them to video platforms. SEO optimisation of our YouTube channel. Monetizing and syndicating content on digital platforms such as YouTube, Reddit, Pinterest, etc. YouTube analytics, content programming, etc. Keep a close watch on competitors' channels and strategize accordingly. Maintaining records of uploaded videos, including metadata, creatives, and other important details. Maintaining the overall health of the channel through correct channel programming and a regular upload schedule. Coordinate with the Digital Marketing team on uploading videos to other digital platforms. Skills: Good communication and writing skills. Curiosity to learn and perform better. Teamwork and excellent work-organizing skills. Self-motivated and flexible to work with negligible supervision. Firm work attitude. Experience: Preferably experience in handling a YouTube channel for a year. Tools Experience (Optional): VidIQ, TubeBuddy, Keyword Planner, SocialBlade, YTLarge, Adobe editing software (After Effects, Premiere Pro). Location: Malad. If you're crazy passionate about cars, if hard work is your fuel and determination is what drives you, then we're looking for you. Pursue your passion for the auto industry with us and grow as much and as fast as you can. We encourage innovative thinking coupled with a proactive approach. At Shivam Autozone, it is all about teamwork in an energetic and cheerful environment. Come, live the change with us!

Posted 1 week ago

Apply

3.0 - 8.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Req ID: 334744. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Databricks Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: Pushing data domains into a massive repository. Building a large data lake, heavily leveraging Databricks. Minimum Skills Required: 3+ years of experience in a Data Engineer or Software Engineer role. Undergraduate degree required (graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. Experience with data pipeline and workflow management tools. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Understanding of data warehouse (DWH) systems and migration from DWH to data lakes. Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models, including warehousing and analytic models. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Working knowledge of message queuing, stream processing, and highly scalable big data stores. Experience supporting and working with cross-functional teams in a dynamic environment. Preferred Qualifications: Experience with Azure cloud services: ADLS, ADF, ADLA, AAS. Minimum Skills Required: 2 years. About NTT DATA: We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form: https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us.
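
Note: As an illustration of the ELT pattern this Databricks posting references (land raw data in the lake first, then transform in place with SQL), here is a minimal PySpark sketch. The paths and table names are hypothetical, and the MERGE syntax assumes Delta tables, the default table format on Databricks.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# EL step: land raw files in the lake with minimal shaping (hypothetical path and table).
(spark.read.option("header", True).csv("/mnt/landing/customers/")
      .write.mode("append").saveAsTable("bronze.customers_raw"))

# T step: transform inside the lakehouse with SQL, upserting into a curated table.
spark.sql("""
    MERGE INTO silver.customers AS tgt
    USING (SELECT id, name, email, updated_at FROM bronze.customers_raw) AS src
    ON tgt.id = src.id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```

In an ETL variant, the same transformation would instead run before the load, typically in an external tool or staging layer.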

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information. For more information, please explore Veradigm.com. What will your job look like: The Sr Operations Support Processor position is responsible for the processing of client insurance, patient deposits, client charges, hospital batches, correspondence, discrepancy notifications and incoming courier information from clients, as well as preparing paper batches to be scanned into the Records Management System. The position is also responsible for importing and indexing FTP clients as well as batch request and discrepancy processing. Processor II is in charge of data entry auditing for Laserfiche only. Must be able to organize time well, multitask, and accurately decipher any discrepancies within batches and report them to management. Additionally, the position supports the company's overall Operations and Client Services departments by efficiently and effectively driving the Revenue Cycle Management process and delivering results.
Main Duties: Deposits - separate and determine payments per Explanation of Benefits (EOB); includes direct patient statement payments; cash, check, and credit card payments. Charge batches - separate payments per the superbills attached, in the order of cash, check, credit card. Deposits and charge batches - confirms all information received is balanced and correct. Manages and provides cash flow sheet information to clients and the company. Processes electronic checks for clients as well as CHMB; processes credit card payments. Verifies deposits and communicates errors and concerns to the manager; prepares deposit information for the correct banks and clients and updates Cash Flow and Laserfiche information; accurate data entry indexing into Laserfiche fields. Oversees some of the company's mailing and shipping processes; opens, terminates and/or transfers PO Boxes with USPS. Processes batches for scanning in a certain order, which allows posting/billing to proceed as efficiently and quickly as possible. Organizes client hospital batches in specific orders for scanning, normally involving large numbers of superbills. Sorts and folds claims and prepares them for outgoing mail; couriers bag information to the correct clients and office personnel. Works with other employees and managers on pulling bad scans or re-scanning to ensure everything is correct in our Records Management System. Imports and indexes FTP files received from clients; audits Laserfiche data entry in metadata fields. Must maintain daily correspondence on successful and unsuccessful uploads with all FTP clients. Processes internal batch request emails; processes discrepancies, both internal and external, from charge batches with missing information. Maintains CHMB processing and RMS documentation (batch headers, policies, FTP protocols and manager lists). Maintains internal training documentation regarding processing batch requests and discrepancies. Creates folders for new clients in Laserfiche and on the FTP site, as well as creating new folders annually in Laserfiche for all CHMB clients. Academic Qualifications: High School Diploma or GED (required). An ideal candidate will have: 3+ years of relevant work experience (preferred). Technical: Extensive knowledge of email, search engines, and the Internet; ability to effectively use payer websites and Laserfiche; knowledge and use of Microsoft products: Outlook, Word, Excel. Preferred experience with various billing systems, such as NextGen, Pro and Allscripts. Personal: Strong written, oral, and interpersonal communication skills; ability to present ideas in business-friendly and user-friendly language; highly self-motivated, self-directed, and attentive to detail; team-oriented, collaborative; ability to effectively prioritize and execute tasks in a high-pressure environment. Communication: Ability to read, analyze and interpret complex documents. Ability to respond effectively to sensitive inquiries or complaints from employees and clients. Ability to speak clearly and to make effective and persuasive arguments and presentations. Math & Reasoning: Ability to add, subtract, multiply, and divide in all units of measure, using whole numbers, common fractions, and decimals. Ability to use critical thinking skills to apply principles of logic and analytical thinking to practical problems.
Extensive knowledge of email, search engines, the Internet, and ten-key; ability to effectively use client credit card websites and Laserfiche; knowledge and use of Microsoft products: Outlook, Word, Excel. Strong written, oral, and interpersonal communication skills. Ability to present ideas in business-friendly and user-friendly language; highly self-motivated, self-directed, and attentive to detail; team-oriented, collaborative. Ability to effectively prioritize and execute tasks in a high-pressure environment. Ability to read and comprehend moderate instructions, correspondence, and memos. Ability to write straightforward correspondence. Ability to effectively present information in one-on-one and small group settings to customers, clients, and other employees of the organization. Ability to add, subtract, multiply, and divide in all units of measure, using whole numbers, common fractions, and decimals. Ability to apply common sense understanding to carry out detailed but uninvolved written or oral instructions. Ability to deal with problems involving several concrete variables in standardized situations. Work Arrangements: Work from the Pune office all 5 days. Shift Timing: 7:30 PM IST to 4:30 AM IST (US shift). Benefits: Veradigm believes in empowering our associates with the tools and flexibility to bring the best version of themselves to work. Through our generous benefits package with an emphasis on work/life balance, we give our employees the opportunity to allow their careers to flourish. Quarterly company-wide Recharge Days. Peer-based incentive Cheer awards. All in to Win bonus program. Tuition Reimbursement Program. To know more about the benefits and culture at Veradigm, please visit the links mentioned below: https://veradigm.com/about-veradigm/careers/benefits/ and https://veradigm.com/about-veradigm/careers/culture/. Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!

Posted 1 week ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Voyager (94001), India, Bangalore, Karnataka. Lead Software Engineer, Data Management - Capital One Software. Ever since our first credit card customer in 1994, Capital One has recognized that technology and data can enable even large companies to be innovative and personalized. As one of the first large enterprises to go all-in on the public cloud, Capital One needed to build cloud and data management tools that didn't exist in the marketplace to enable us to operate at scale in the cloud. And in 2022, we publicly announced Capital One Software and brought our first B2B software solution, Slingshot, to market. Building on Capital One's pioneering adoption of modern cloud and data capabilities, Capital One Software is helping accelerate the data management journey at scale for businesses operating in the cloud. If you think of the kind of challenges that companies face - things like data publishing, data consumption, data governance, and infrastructure management - we've built tools to address these various needs along the way. Capital One Software will continue to explore where we can bring our solutions to market to help other businesses address these same needs going forward. We are seeking top-tier talent to join our pioneering team and propel us towards our destination. You will be joining a team of innovative product, tech, and design leaders that tirelessly seek to question the status quo. As a Lead Software Engineer, you'll have the opportunity to be on the forefront of building this business and bringing these tools to market. As a Lead Software Engineer - Data Management, you will: Help build innovative products and solutions for problems in the Data Management domain. Maintain knowledge of industry innovations, trends and practices to curate a continual stream of incubated projects and create rapid product prototypes. Participate in technology events to support brand awareness of the organization and to attract top talent. Basic Qualifications: Bachelor's degree in Computer Science. At least 8 years of professional software development experience (internship experience does not apply). At least 3 years of experience in building software solutions to problems in one of the Data Management areas listed below: Data Catalog or Metadata Store; Access Control or Policy Enforcement; Data Governance; Data Lineage; Data Monitoring and Alerting; Data Scanning and Protection. At least 3 years of experience in building software using at least one of the following: Golang, Java, Python, Rust, C++. At least 3 years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). Preferred Qualifications: Master's degree in Computer Science or a related field. 10+ years of professional software development experience (internship experience does not apply). Experience in building a commercial Data Management product from the ground up. Experience in supporting a commercial Data Management product in the cloud with enterprise clients. At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace.
Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Why we need this role: We are looking for a detail-oriented and strategically minded Senior Analyst to join our Segmentation & Customer Onboarding team. This role is pivotal in driving analytical insights that shape customer acquisition and onboarding strategies, leveraging tools like Salesforce, Excel, and CRM platforms. What you will do: • Design and implement customer segmentation frameworks to support targeted onboarding strategies • Optimize and manage workflows using the Salesforce New Account Model for seamless onboarding • Analyze performance data to recommend process improvements and enhance customer experience • Collaborate with cross-functional teams to ensure accurate and efficient CRM data handling • Generate reports, dashboards, and actionable insights using advanced Excel skills • Utilize Dun & Bradstreet (D&B) datasets to enrich customer profiles and drive smarter segmentation • Ensure data integrity by maintaining data hygiene and conducting regular Salesforce audits. What we’re looking for: • Proven experience in customer segmentation and onboarding analytics • Proficiency in Salesforce, specifically the New Account Model • Strong command of Microsoft Excel, including pivot tables, advanced formulas, and data visualization • Solid understanding of CRM platforms and data architecture • Familiarity with D&B data services and their application in business analytics • Excellent communication skills and stakeholder management abilities • A knack for problem-solving and analytical thinking. 3-5 years of experience with any graduation degree. Competencies: Solving Complex Problems, Interacting with People at Different Levels, Prioritizing and Organizing Work, Serving Customers, Building and Supporting Teams, Driving for Results, Using Math Skills, Continuous Process Improvement, Data Management, Data Analysis, Research, Reports Development, Metadata Analysis, Sales Tools. Education: A bachelor’s or master’s degree in business administration, marketing or a relevant field. Roles and Responsibilities: Senior Analyst, Customer Onboarding & Salesforce Governance

Posted 1 week ago

Apply

7.0 - 12.0 years

16 - 18 Lacs

Panchkula

Work from Office

Job Description: We're looking for a Senior Salesforce Developer with 4-6 years of hands-on experience to join our growing CRM team. If you're passionate about building scalable Salesforce solutions across Sales, Service, and Experience Clouds, this role is for you. As a Sr. Developer, you'll design and develop custom components, drive performance improvements, and integrate third-party platforms. You'll work closely with architects and cross-functional teams to ensure timely, secure, and maintainable deployments. Key Skills: Expertise in Sales Cloud, Service Cloud, and Experience Cloud. Strong proficiency in Apex (classes, triggers, batch, schedulers) and Lightning Web Components (LWC). Solid knowledge of Aura Components, Flow Builder, and the transition to flow-first architecture. Good understanding of the Salesforce data model, sharing rules, and security architecture. Experience with REST/SOAP APIs, OAuth 2.0, Named Credentials, and Platform Events. Skilled in SOQL/SOSL, custom metadata, custom settings, and dynamic Apex. Familiarity with DevOps tools like Copado, Gearset, Salesforce DX, and version control (Git). Proficient in declarative tools (Validation Rules, Approval Processes). Roles and Responsibilities: Develop and optimize Salesforce solutions across various clouds. Build and maintain reusable components using LWC and Apex. Integrate Salesforce with external systems using REST/SOAP APIs. Ensure code quality using tools like PMD and follow best practices. Collaborate with cross-functional teams for requirement analysis and solution design. Maintain and improve data security, role hierarchy, and sharing rules. Contribute to DevOps practices for deployment and version control. Continuously update skills with new Salesforce features and releases. Life at Grazitti: Share Your Profile. We are always looking for the best talent to join our team.
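
Note: To make the "integrate Salesforce with external systems using REST/SOAP APIs" responsibility concrete, here is a minimal Python sketch of running a SOQL query against the standard Salesforce REST query endpoint. The instance URL, token, and query are placeholders; in practice the token would come from an OAuth 2.0 flow or a Named Credential rather than being hard-coded.

```python
import requests

INSTANCE_URL = "https://example.my.salesforce.com"   # placeholder org
ACCESS_TOKEN = "00D...example"                       # placeholder OAuth token

def run_soql(soql: str) -> list[dict]:
    """Run a SOQL query via the Salesforce REST API and return the records."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

print(run_soql("SELECT Id, Name FROM Account LIMIT 5"))
```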

Posted 1 week ago

Apply

8.0 - 10.0 years

35 - 40 Lacs

Hyderabad

Work from Office

AutoRABIT Profile: AutoRABIT is the leader in DevSecOps for SaaS platforms such as Salesforce. Its unique metadata-aware capability makes Release Management, Version Control, and Backup & Recovery complete, reliable, and effective. AutoRABIT's highly scalable framework covers the entire DevSecOps cycle, which makes it the favourite platform for companies, especially large ones that require enterprise strength and robustness in their deployment environment. AutoRABIT increases the productivity and confidence of developers, which makes it a critical tool for development teams, especially large ones with complex applications. AutoRABIT has institutional funding and is well positioned for growth. Headquartered in California, USA, and with customers worldwide, AutoRABIT is a place for bringing your creativity to the most demanding SaaS marketplace. Job Role: We are seeking a highly experienced Cloud Architect with deep expertise in AWS to lead the design, security, and deployment strategies of our cloud infrastructure. The ideal candidate will architect scalable, secure, and resilient cloud solutions and enforce best practices such as CIS Benchmarks and STIG compliance. Roles & Responsibilities: Design and implement secure, scalable cloud architectures using AWS services (EKS, ECS, EC2, S3, RDS, ALB, Auto Scaling, Redis, etc.). Architect and guide application hosting and deployment strategies. Establish and enforce security standards, including secure hardening guidelines, CIS Benchmarks, and STIG. Collaborate with security, development, and DevOps teams to embed security throughout the infrastructure. Create and maintain Infrastructure-as-Code using Terraform. Provide thought leadership in infrastructure modernization and automation practices. Work with compliance teams to ensure regulatory requirements are met. Responsibility to adhere to set internal controls. Desired Skills and Knowledge: 8+ years of experience in cloud infrastructure and architecture. Deep understanding of AWS services and networking (VPC, IAM, Security Groups, etc.). Hands-on experience with Terraform, AWS Lambda (Python 3 + Boto3), and automation scripting. Experience defining and implementing cloud security best practices. Strong understanding of monitoring, threat detection (Trend Micro), and security audits. Familiarity with secure deployment and hardening practices (CIS, STIG, etc.). Excellent documentation and communication skills. Education: Bachelor's in Computers or a related field. Location: Hyderabad, Hybrid - 3 days from office. Experience: 8-10 years. Compensation: 35-40 LPA. Website: www.autorabit.com
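
Note: As a small, hedged illustration of the "AWS Lambda (Python 3 + Boto3)" and CIS-style hardening work this posting mentions, the sketch below flags S3 buckets without an explicit default server-side encryption configuration, a common CIS-type control. The specific check and output format are assumptions for illustration, not AutoRABIT's actual implementation.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def lambda_handler(event, context):
    """Report buckets lacking a default server-side encryption configuration."""
    unencrypted = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_bucket_encryption(Bucket=name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
                unencrypted.append(name)
            else:
                raise
    return {"unencrypted_buckets": unencrypted}
```

In a real deployment this would typically run on a schedule (for example via EventBridge) and feed findings into an alerting or ticketing pipeline.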

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Pune

Work from Office

At Securonix, we're on a mission to secure the world by staying ahead of cyber threats, reinforcing all layers of our platform with AI capabilities. Our Securonix Unified Defense SIEM provides organizations with the first and only AI-reinforced solution built with a cybersecurity mesh architecture on a highly scalable data cloud. Enhanced by Securonix EON's AI capabilities, our innovative cloud-native solution delivers a seamless CyberOps experience, empowering organizations to scale their security operations and keep up with evolving threats. Recognized as a five-time leader in the Gartner Magic Quadrant for SIEM and highly rated on Gartner Peer Insights, our award-winning Unified Defense SIEM provides organizations with 365 days of hot data for rapid search and investigation, threat content-as-a-service, proactive defense through continuous peer and partner collaboration, and a fully integrated Threat Detection, Investigation, and Response (TDIR) experience, all within a single platform. Built on a cloud-native architecture, the platform leverages the Snowflake Data Cloud for unparalleled scalability and performance. Securonix is proud to be a cybersecurity unicorn and featured in CRN's 2024 Security 100 list. Backed by Vista Equity Partners, one of the largest private equity firms with over $100 billion in assets under management, we have a unique advantage in driving innovation and growth. With a global footprint, we serve more than 1,000 customers worldwide, including 10% of the Fortune 100. Our network of 150+ partners and Managed Security Service Providers (MSSPs) enables us to deliver unmatched security solutions on a global scale. At Securonix, we are driven by our core values and place our people at the heart of everything we do: Winning as One Team: We work together with universal respect to achieve aligned outcomes. Customer Driven Innovation: We innovate to stay ahead of the market and create value for our customers. Agility in Action: We embrace change and are unified in our purpose and objectives amidst change. Join us as we redefine cybersecurity, innovate fearlessly, and grow together as one team. Role Summary: We are looking for a savvy Data Engineer to join our growing team of analytics experts, with hands-on experience working on cutting-edge technology. The hire will be responsible for expanding and optimizing our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams, which will impact the overall UEBA (User Entity Behavior Analytics) use cases. An ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up, and will be responsible for out-of-the-box threat use case delivery. The Data Engineer will support our software developers, SIEM engineers, and data analysts, work with the Threat Labs data scientists on data parsing and use case delivery, and ensure optimal delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next-generation SIEM (Security Information and Event Management). Primary Responsibility: Implement/develop out-of-the-box parsers/connectors and be responsible for net-new development of parsers, including enhanced categorizations and use cases.
Data validation program for supported data sources, with responsibility for quality assurance. Content validation for all the out-of-the-box use cases and threat models. Implementation and validation of supported dashboards/reports and net-new development of custom dashboards/reports. Coordinate with product management and engineering to troubleshoot connector integration issues for various products. Work with data and analytics experts from Securonix Threat Labs to strive for greater functionality in our data systems and streamline supported data parsing and use case configurations. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Minimum Requirements: Strong experience in regex implementation and parser creation (must have). 3+ years of hands-on working experience in engineering development and SIEM solution deployment. Good amount of experience with SVN, Git or any other version control tool (must have). Intermediate working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working with a variety of databases. Experience building and optimizing big data pipelines, architectures and data sets. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with structured and unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. A successful history of manipulating, processing and extracting value from large disconnected datasets. Strong working knowledge of parser management, stream processing, and highly scalable big data stores. Strong product management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment. 1-3 years' experience in Data Engineering with a graduate (master's) degree in Computer Science, Information Systems or Cyber Security, OR 3+ years of experience in Data Engineering with a bachelor's degree in Computer Science, Information Systems or Cyber Security. They should also have experience using the following software/tools: Experience with relational SQL databases. Experience with object-oriented/object function scripting languages (one of the following): Python, Java or Bash scripting. SVN/Git or any version control tool. Preferred: Experience with NoSQL databases - Redis. Experience with object-oriented/object function scripting languages (one of the following): Python, Java, Bash. Experience with big data tools: Hadoop, Spark, Kafka, etc.
CISSP/CEH certification or any certification related to SIEM/UEBA deployment. Leadership certification and/or awards attained for leadership skills. Working knowledge of cloud technologies such as Amazon, Azure and Google. Good understanding of log collection and forwarding technologies such as syslog-ng, rsyslog, NXLog, Windows Event Forwarding. Experience integrating endpoint security and host-based intrusion detection solutions. Experience with networking technologies such as Wireshark, PCAP, tcpdump. Benefits - As a full-time employee with Securonix, you will be eligible for the following employee benefits: Health Insurance with a total sum insured of INR 7,50,000; coverage: self, spouse, 2 kids, and dependent parents or parents-in-law. Personal Accident cover with a total sum insured of INR 10,00,000. Term Life Insurance with a sum assured of 5 times fixed base pay.
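
Note: Since the posting above emphasizes regex-based parser creation for SIEM log sources, here is a minimal Python sketch of a syslog-style parser. The pattern and field names are illustrative assumptions, not Securonix's actual parser framework.

```python
import re

# Rough pattern for an RFC 3164-style syslog line; field names are illustrative.
SYSLOG_RE = re.compile(
    r"^(?P<timestamp>\w{3}\s+\d{1,2}\s[\d:]{8})\s"
    r"(?P<host>\S+)\s"
    r"(?P<process>[\w\-/]+)(?:\[(?P<pid>\d+)\])?:\s"
    r"(?P<message>.*)$"
)

def parse_line(line: str):
    """Return normalized fields for a syslog line, or None if it does not match."""
    m = SYSLOG_RE.match(line)
    return m.groupdict() if m else None

print(parse_line("Jun 12 10:15:01 fw01 sshd[4321]: Failed password for root from 10.0.0.5"))
```

A production parser would also normalize timestamps, map fields to a common event schema, and handle lines that fail to match.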

Posted 1 week ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Job Title: Collibra Data Governance Specialist. Location: Hyderabad. Job Type: Full-Time. About the Role: We are seeking a highly skilled and experienced Collibra Expert to lead and manage our enterprise-level data governance initiatives using Collibra. This role requires deep expertise in Collibra's platform, including configuration, integration, workflow development, and stakeholder engagement. The ideal candidate will be responsible for implementing and maintaining data governance frameworks, ensuring data quality, and enabling data stewardship across the organization. Key Responsibilities: Lead the end-to-end implementation and administration of the Collibra Data Intelligence Platform. Design and configure the Collibra Operating Model, including domains, assets, workflows, and roles. Develop and maintain custom workflows using BPMN and Collibra Workflow Designer. Integrate Collibra with enterprise systems (e.g., Snowflake, Informatica, Tableau, Azure, SAP) using APIs and connectors. Collaborate with data stewards, data owners, and business users to define and enforce data governance policies. Implement and monitor data quality rules, lineage, and metadata management. Provide training and support to business and technical users on Collibra usage and best practices. Act as a Collibra SME and evangelist within the organization, promoting data governance maturity. Maintain documentation and ensure compliance with internal and external data governance standards. Required Skills & Qualifications: 5+ years of experience in data governance, metadata management, or data quality. 3+ years of hands-on experience with Collibra, including configuration, workflow development, and integration. Strong understanding of data governance frameworks, data stewardship, and data lifecycle management. Proficiency in Collibra APIs, BPMN, and scripting languages (e.g., Groovy, JavaScript). Experience with data cataloging, lineage, and business glossary in Collibra. Familiarity with data platforms like Snowflake, Azure, AWS, Informatica, or similar. Excellent communication and stakeholder management skills. Collibra Ranger or Solution Architect certification is a plus. Preferred Qualifications: Experience in enterprise-level deployments of Collibra. Knowledge of regulatory compliance (e.g., GDPR, HIPAA, CCPA). Background in data architecture or data engineering is a plus.

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Noida, Pune, Bengaluru

Work from Office

Description: We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals. Requirements: 4+ years of experience in data governance, data management, or data security. Hands-on experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, and Google Data Catalog. Strong command of metadata management, data lineage, and data quality tools (e.g., Collibra, Informatica). Deep understanding of data privacy laws and compliance frameworks. Proficiency in SQL and Python for governance automation. Experience with RBAC, encryption, and data masking techniques. Familiarity with ETL/ELT pipelines and data warehouse architectures. Job Responsibilities: Develop and implement comprehensive data governance frameworks, focusing on metadata management, lineage tracking, and data quality. Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS. Manage metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog. Collaborate with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards. Automate processes for data classification, monitoring, and reporting using Python and SQL. Support data stewardship initiatives, including the development of data dictionaries and governance documentation. Optimize ETL/ELT pipelines and data workflows to meet governance best practices. What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks and a GL Club where you can have coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!
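
Note: To make the "governance automation with Python and SQL" responsibility concrete, here is a minimal sketch of a data-quality check run through the BigQuery Python client. The project, dataset, table, and rule are hypothetical examples, not part of the listing.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical completeness rule: orders must always carry a customer_id.
sql = """
    SELECT COUNT(*) AS missing_customer_id
    FROM `my_project.my_dataset.orders`
    WHERE customer_id IS NULL
"""
rows = list(client.query(sql).result())
print(f"Rows violating the completeness rule: {rows[0].missing_customer_id}")
```

A real governance job would log such results to a monitoring table or alerting channel rather than printing them.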

Posted 1 week ago

Apply

0.0 - 1.0 years

2 - 5 Lacs

Bengaluru

Work from Office

We are seeking highly motivated MTech/MSc students with a background in environmental science, ecology, geoinformatics, or related fields to assist in biodiversity data processing and geospatial analysis for the Bengaluru region. The work supports a larger initiative on Ecosystem-based Adaptation (EbA) and climate-resilient urban planning in Bengaluru. Responsibilities: Categorise data into major biodiversity classes (e.g. birds, amphibians, reptiles, insects, and plants) and further assign indicator taxa to relevant blue-green (BG) infrastructure subclasses (e.g. parks, lakes, wetlands, and urban forests). Identify native and non-native species and conduct statistical analysis. Use GIS tools to visualise biodiversity patterns across Bengaluru. Provide support for conducting ground surveys on biodiversity indicators. Document methodology, metadata, and preliminary findings in a structured format. Qualifications: Currently enrolled in a Master's programme in Ecology, Environmental Science, Geography, Geoinformatics, or a related discipline. Skill Set: Demonstrated interest in biodiversity, species identification, and spatial data analysis. Familiarity with ecological and taxonomic datasets. Experience or coursework in GIS platforms (ArcGIS Pro, QGIS) and statistical software (R, Stata) preferred. Proficiency in R or Python for data cleaning and statistical analysis. Skilled in Excel for data sorting and tabular analysis. Strong attention to detail and experience working with large datasets. Effective communication skills and ability to document analytical workflows.
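
Note: As a hedged illustration of the categorisation and tabular analysis described above, here is a small pandas sketch that maps occurrence records to broad biodiversity groups and summarises them by blue-green infrastructure subclass. The column names and records are made-up examples; a real export would differ.

```python
import pandas as pd

# Made-up occurrence records with a taxonomic class and a survey site type.
records = pd.DataFrame({
    "species": ["Corvus splendens", "Ficus benghalensis", "Duttaphrynus melanostictus"],
    "taxon_class": ["Aves", "Magnoliopsida", "Amphibia"],
    "site_type": ["park", "urban_forest", "lake"],
})

# Map taxonomic classes to the broad biodiversity groups used in the analysis.
group_map = {"Aves": "birds", "Amphibia": "amphibians", "Magnoliopsida": "plants"}
records["group"] = records["taxon_class"].map(group_map)

# Summarise indicator groups by blue-green infrastructure subclass.
print(records.groupby(["site_type", "group"]).size().unstack(fill_value=0))
```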

Posted 1 week ago

Apply

8.0 - 10.0 years

20 - 25 Lacs

Gurugram

Work from Office

Position Title: Data Engineer. Position Type: Regular - Full-Time. Position Location: Gurgaon. Requisition ID: 37277. Position Summary: Data engineers are mainly responsible for designing, building, managing, and operationalizing data pipelines to support key data and analytics use cases. They play a crucial role in constructing and maintaining a modern, scalable data platform that utilizes the full capabilities of a Lakehouse Platform. You will be a key contributor to our data-driven organization, playing a vital role in both building a modern data platform and maintaining our Enterprise Data Warehouse (EDW). You will leverage your expertise in the Lakehouse Platform to design, develop, and deploy scalable data pipelines using modern and evolving technologies. Simultaneously, you will take ownership of the EDW architecture, ensuring its performance, scalability, and alignment with evolving business needs. Your responsibilities will encompass the full data lifecycle, from ingestion and transformation to delivery of high-quality datasets that empower analytics and decision-making. Duties and responsibilities: Build data pipelines using Azure Databricks: Build and maintain scalable data pipelines and workflows within the Lakehouse environment. Transform, cleanse, and aggregate data using Spark SQL or PySpark. Optimize Spark jobs for performance, cost efficiency, and reliability. Develop and manage Lakehouse tables for efficient data storage and versioning. Utilize notebooks for interactive data exploration, analysis, and development. Implement data quality checks and monitoring to ensure accuracy and reliability. Drive automation: Implement automated data ingestion processes using functionality available in the data platform, optimizing for performance and minimizing manual intervention. Design and implement end-to-end data pipelines, incorporating transformations, data quality checks, and monitoring. Utilize CI/CD tools (Azure DevOps/GitHub Actions) to automate pipeline testing, deployment, and version control. Enterprise Data Warehouse (EDW) management: Create and maintain data models, schemas, and documentation for the EDW. Collaborate with data analysts, data scientists and business stakeholders to gather requirements, design data marts, and provide support for reporting and analytics initiatives. Troubleshoot and resolve any issues related to data loading, transformation, or access within the EDW. Educate and train: The data engineer should be curious and knowledgeable about new data initiatives and how to address them. This includes applying their data and/or domain understanding in addressing new data requirements. They will also be responsible for proposing appropriate (and innovative) data ingestion, preparation, integration and operationalization techniques in addressing these data requirements. The data engineer will be required to train counterparts in these data pipelining and preparation techniques. Ensure compliance with data governance and security: The data engineer is responsible for ensuring that the data sets provided to users are compliant with established governance and security policies. Data engineers should work with data governance and data security teams while creating new and maintaining existing data pipelines to guarantee alignment and compliance. Qualifications. Education: Bachelor's or master's degree in Computer Science, Information Management, Software Engineering, or equivalent work experience.
Work Experience: At least four years of experience working in data management disciplines, including data integration, modeling, optimization and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks. At least three years of experience working in cross-functional teams and collaborating with business stakeholders in support of a departmental and/or multi-departmental data management and analytics initiative. Technical knowledge, abilities, and skills: Ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata, and workload management. The ability to work with both IT and business in integrating analytics and data science output into business processes and workflows. Strong knowledge of database programming languages and hands-on experience with any RDBMS. McCain Foods is an equal opportunity employer. As a global family-owned company, we strive to be the employer of choice in the diverse communities around the world in which we live and work. We recognize that inclusion drives our creativity, resilience, and success and makes our business stronger. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, age, veteran status, disability, or any other protected characteristic under applicable law. McCain is an accessible employer. If you require an accommodation throughout the recruitment process (including alternate formats of materials or accessible meeting rooms), please let us know and we will work with you to find appropriate solutions. Your privacy is important to us. By submitting personal data or information to us, you agree this will be handled in accordance with McCain's Global Privacy Policy and Global Employee Privacy Policy, as applicable. You can understand how your personal information is being handled here. Job Family: Information Technology. Division: Global Digital Technology. Department: Global Data and Analytics. Location(s): IN - India : Haryana : Gurgaon. Company: McCain Foods (India) P Ltd
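
Note: For context on the "transform, cleanse, and aggregate data using Spark SQL or PySpark" responsibility in this posting, here is a minimal PySpark sketch. The table and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("raw.orders")  # hypothetical lakehouse table

daily_sales = (
    orders
    .filter(F.col("status") == "SHIPPED")                 # cleanse: keep completed orders only
    .withColumn("order_date", F.to_date("order_ts"))      # transform: derive a date column
    .groupBy("order_date", "region")                      # aggregate
    .agg(F.sum("amount").alias("total_amount"),
         F.countDistinct("order_id").alias("order_count"))
)

daily_sales.write.mode("overwrite").saveAsTable("curated.daily_sales")
```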

Posted 1 week ago

Apply

8.0 - 13.0 years

40 - 45 Lacs

Hyderabad

Work from Office

About the job: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives. As part of the Digital M&S Foundations organization, the data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational and dimensional databases. These solutions support Manufacturing and Supply data and analytical products and other business interests. What you will be doing: Be responsible for the development of the conceptual, logical, and physical data models in line with the architecture and platforms strategy. Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively with the M&S teams. Demonstrate strong expertise in one of the following functional business areas of M&S: Manufacturing, Quality or Supply Chain. Main Responsibilities: Design and implement business data models in line with data foundations strategy and standards. Work with business and application/solution teams to understand requirements, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, and analytic models. Hands-on data modeling, design, configuration, and performance tuning. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Skills: Bachelor's or master's degree in computer/data engineering or related technical experience. 8+ years of hands-on relational, dimensional, and/or analytic experience, including 5+ years of hands-on experience with data from core manufacturing and supply chain systems such as SAP, Quality Management, LIMS, MES, and Planning. Hands-on programming experience in SQL. Experience with data warehouse (Snowflake), data lake (AWS based), and enterprise big data platforms in a pharmaceutical company. Good knowledge of metadata management, data modeling, and related tools: Snowflake, Informatica, DBT. Experience with Agile. Good communication and presentation skills. Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks' gender-neutral parental leave. Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas. Pursue Progress. Discover Extraordinary. Progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
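For illustration only: a minimal Python sketch of the kind of dimensional (star-schema) structure this data modeler role describes, issued against Snowflake through the snowflake-connector-python package. The table and column names (fact_batch_yield, dim_material, dim_site) and the connection parameters are hypothetical placeholders, not Sanofi's actual model.

import snowflake.connector

# Illustrative star schema: two dimensions and one fact table (names are made up).
DDL_STATEMENTS = [
    """
    CREATE TABLE IF NOT EXISTS dim_material (
        material_key   INTEGER IDENTITY PRIMARY KEY,
        material_code  VARCHAR(40),      -- assumed natural key from an ERP such as SAP
        description    VARCHAR(200),
        material_group VARCHAR(40)
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS dim_site (
        site_key  INTEGER IDENTITY PRIMARY KEY,
        site_code VARCHAR(20),
        site_name VARCHAR(100),
        country   VARCHAR(60)
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS fact_batch_yield (
        batch_id      VARCHAR(40),
        material_key  INTEGER REFERENCES dim_material(material_key),
        site_key      INTEGER REFERENCES dim_site(site_key),
        produced_qty  NUMBER(18, 3),
        rejected_qty  NUMBER(18, 3),
        produced_date DATE
    )
    """,
]

def build_mart() -> None:
    # Connection parameters are stubs; supply real credentials via config or a secrets store.
    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        cur = conn.cursor()
        for statement in DDL_STATEMENTS:
            cur.execute(statement)
    finally:
        conn.close()

if __name__ == "__main__":
    build_mart()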

Posted 1 week ago

Apply

3.0 - 8.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Join us for an exciting opportunity to lead privacy product management at JPMC. As a Product Manager within the Privacy Product team at JPMorgan Chase, you will spearhead the strategic vision and development of privacy tools, ensuring adherence to privacy legislation. You will work closely with technology teams to ensure progress and effectively communicate objectives to diverse audiences. Job Responsibilities Collaborate with Product Owners to create a strategic vision for the Privacy Product. Develop and maintain product roadmaps, balancing internal and client demands. Understand broader Privacy Product impacts and maintain product integrity. Solve client problems and deliver viable solutions according to plans. Work closely with Product Team and IT partners to meet development demands. Participate actively in agile scrum ceremonies and meetings. Coordinate and execute UAT testing to ensure deliverables meet user requirements. Support integrated testing, pilot, and production validation as needed. Collaborate with the training team to produce and deliver training materials. Develop communications and presentations for executive leadership. Required Qualifications, Capabilities, and Skills Bachelor's degree in computer science, business, or a related field, with 3+ years of product or related experience. Experience in creating strategic roadmaps from client requirements, with an understanding of the software development cycle and agile methodology. Proven success working with internal/external teams at various management levels. Experience with technical concepts and collaboration with technical staff. Expertise in JIRA and agile development tools. An understanding of PII, metadata, and data management concepts. Experience working with Agile teams. Ability to influence and communicate with stakeholders across functions. Preferred Qualifications, Capabilities, and Skills Proficiency in Microsoft Office Suite and SQL/queries. Thrives in a fast-paced, collaborative environment and has excellent analytical and problem-solving skills. Ability to work collaboratively in teams and develop meaningful relationships. Independent leader with discernment for appropriate escalation and an ability to handle multiple priorities on tight deadlines without compromising quality.

Posted 1 week ago

Apply

2.0 - 7.0 years

6 - 10 Lacs

Pune

Work from Office

Junior Associate, Solution Engineering (Salesforce Developer) Pune, India Are you excited by the opportunity of using your knowledge of testing to lead a team to success? Do you view Integrations Engineering in the retail financial industry as more than just APIs, and instead see it as an opportunity to improve the Digital Banking customer experience? Do you have experience working with Salesforce technologies, solutioning and developing enhancements, and implementing banking clients? If so, why not consider joining Western Union's new Digital Banking hub in Pune as a Junior Associate, Solution Engineering. Western Union powers your pursuit. We recently launched digital banks and wallets in multiple markets to enhance our customers' experiences with a cutting-edge digital ecosystem. As an Associate, you will be responsible for implementation of a service delivery model supporting assigned platforms through a predefined framework. This includes oversight and management of the Western Union Payments data analytics/business intelligence program. Role Responsibilities Ensure that software functionality is implemented with a focus on code optimization and organization. Recommend improvements to existing software programs. Troubleshoot application issues and coordinate issue resolution with operations, functional, and technical teams. Work with a software development team and service providers in a geographically distributed structure. Work independently on simple to medium projects. Must be a problem solver with demonstrated experience in solving difficult technology challenges, with a can-do attitude. Experience in developing and implementing web-based solutions. Knowledge of object-oriented principles and techniques. Knowledge of logical and physical database architectures and operating systems. Self-starter with ability to multi-task, prioritize, manage workload, and consistently deliver results. Experience in Agile and iterative development methodologies. Strong communication skills with ability to interact with partners globally. Experience in financial services and business practices. Experience in capturing business reporting requirements and design/development of reports. Role Requirements 2+ years of software development experience with a focus on software design & development. 2+ years of experience in Salesforce, Service Cloud and the force.com platform. Good to have: Java knowledge. Hands-on experience working with Service Cloud standard features - Email-to-Case, web forms, Chat, Entitlements & Milestones, Omni-Channel, CTI, etc. Strong foundation and understanding of web development using HTML & JavaScript in the Salesforce ecosystem. Experience working in Apex, Lightning Framework and components (Aura and LWC), integrations (Salesforce REST API - a brief illustrative sketch follows this listing), Triggers, SLDS, Batch, and configurations like Flows, validation rules, Custom Metadata Types, profiles, roles, sharing rules, SFDX, etc. on the Salesforce.com platform. Good knowledge of Agile and iterative development methodologies, proficiency using JIRA and version control systems (e.g., GitLab), and exposure to SFDX-based deployments. Certifications such as Service Cloud Consultant, App Builder, Platform Developer I, or JavaScript Developer I are preferred. We make financial services accessible to humans everywhere. Join us for what's next. Just as we help our global customers prosper, we support our employees in achieving their professional aspirations.
You'll have plenty of opportunities to learn new skills and build a career, as well as receive a great compensation package. If you're ready to help drive the future of financial services, it's time for Western Union. Learn more about our purpose and people at https://careers.westernunion.com/. Benefits You will also have access to short-term incentives, multiple health insurance options, accident and life insurance, and access to best-in-class development platforms, to name a few (https://careers.westernunion.com/global-benefits/). Please see the location-specific benefits below and note that your Recruiter may share additional role-specific benefits during your interview process or in an offer of employment. Your India-specific benefits include: Employees Provident Fund [EPF] Gratuity Payment Public holidays Annual Leave, Sick leave, Compensatory leave, and Maternity / Paternity leave Annual Health Check-up Hospitalization Insurance Coverage (Mediclaim) Group Life Insurance, Group Personal Accident Insurance Coverage, Business Travel Insurance Cab Facility Relocation Benefit Western Union values in-person collaboration, learning, and ideation whenever possible. We believe this creates value through common ways of working and supports the execution of enterprise objectives which will ultimately help us achieve our strategic goals. By connecting face-to-face, we are better able to learn from our peers, problem-solve together, and innovate. Our Hybrid Work Model categorizes each role into one of three categories. Western Union has determined the category of this role to be Hybrid. This is defined as a flexible working arrangement that enables employees to divide their time between working from home and working from an office location. The expectation is to work from the office a minimum of three days a week. We are passionate about diversity. Our commitment is to provide an inclusive culture that celebrates the unique backgrounds and perspectives of our global teams while reflecting the communities we serve. We do not discriminate based on race, color, national origin, religion, political affiliation, sex (including pregnancy), sexual orientation, gender identity, age, disability, marital status, or veteran status. The company will provide accommodation to applicants, including those with disabilities, during the recruitment process, following applicable laws. #L1-HR1 #LI-Hybrid Estimated Job Posting End Date: 07-29-2025 This application window is a good-faith estimate of the time that this posting will remain open. This posting will be promptly updated if the deadline is extended or the role is filled.
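For illustration only: a minimal Python sketch of the Service Cloud REST integration this role touches on, querying open cases through the standard Salesforce REST query endpoint. The instance URL, access token, and API version are assumed placeholders; in practice the token would come from the org's OAuth flow, and error handling would be more robust.

import requests

INSTANCE_URL = "https://<your-instance>.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<oauth-access-token>"                        # obtain via your org's OAuth flow
API_VERSION = "v59.0"                                        # adjust to your org's supported version

def fetch_open_cases(limit: int = 10) -> list[dict]:
    # SOQL query against the standard Case object; returns the raw record dictionaries.
    soql = f"SELECT Id, CaseNumber, Subject, Status FROM Case WHERE IsClosed = false LIMIT {limit}"
    response = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("records", [])

if __name__ == "__main__":
    for case in fetch_open_cases():
        print(case["CaseNumber"], case["Status"], case.get("Subject"))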

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Pune

Work from Office

Design, develop, test, and deploy scalable Salesforce applications using LWC, Apex, Visualforce, and SOQL/SOSL. Implement and manage Sales Cloud and Service Cloud features and functionalities. Develop and maintain custom Apex classes, triggers, and batch jobs. Integrate Salesforce with external systems using REST/SOAP APIs. Customize standard and custom Salesforce objects, page layouts, and workflows. Collaborate with stakeholders to gather requirements and translate them into technical solutions. Participate in code reviews and follow best practices in code quality, version control, and deployment. Troubleshoot and resolve issues in production and lower environments. Ensure data quality, integrity, and compliance with Salesforce security standards. Required Skills and Experience: 3+ years of experience in Salesforce development. Proficiency in LWC (Lightning Web Components) and Aura Components. Strong experience in Apex classes, triggers, batch processing, and custom metadata types. Solid understanding of Sales Cloud and Service Cloud architecture and capabilities. Hands-on experience integrating with third-party systems using REST and SOAP web services. Strong understanding of Salesforce data model, security model, sharing rules, and role hierarchy. Experience with Salesforce DX, Change Sets, and Version Control (Git) is a plus.
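For illustration only: a minimal Python sketch of the "integrate Salesforce with external systems using REST APIs" requirement above, creating a Case record from an external service via the sobjects endpoint. The instance URL, token, and field values are assumed placeholders, not a prescribed implementation.

import requests

INSTANCE_URL = "https://<your-instance>.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<oauth-access-token>"                        # placeholder
API_VERSION = "v59.0"                                        # placeholder

def create_case(subject: str, description: str) -> str:
    # POST to the sobjects endpoint creates a record and returns its id.
    response = requests.post(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/sobjects/Case",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        json={"Subject": subject, "Description": description, "Origin": "Web"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["id"]

if __name__ == "__main__":
    print(create_case("Order sync failure", "Example payload rejected by a downstream system."))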

Posted 1 week ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Gurugram

Work from Office

Job Title: Associate Vice President - Data Science Work Type: Permanent Location: DLF Downtown - Gurgaon It's more than a career at NAB. It's about more meaningful work, more global opportunities and more innovation beyond boundaries. Your job is just one part of your life. When you bring your ideas, energy, and hunger for growth, you'll be recognised and rewarded for your contribution in return. You'll have our support to excel for our customers, deliver positive change for our communities and grow your career. NAB has established NAB Innovation Centre India as a centre for operations and technology excellence to support NAB in delivering faster, better, and more personalized experiences to customers and colleagues. At NAB India, we're ramping up and growing at a very fast pace. Our passionate leaders recruit and develop high-performing people, empowering them to deliver exceptional outcomes to make a positive difference in the lives of our customers and our communities. YOUR NEW ROLE: Interacts with product and service teams to identify questions and issues for data analysis and experiments. Develops and codes software programs, algorithms and automated processes to cleanse, integrate and evaluate large data sets from multiple disparate sources. Providing hands-on support as required in formulating a coherent cross-business approach and strategic/tactical plan for big data initiatives. Learning, adopting and leveraging data science best practice to deliver quantitative improvements to the analytics and process modelling functions. Working with massive and complex data sets from multiple sources, utilising big data tools and techniques for the purposes of analysing, providing insight and validating hypotheses. Performing deep dive analyses of experiments through reliable modelling methods that include numerous explanatory variables and covariates (a small illustrative analysis sketch follows this listing). Translating analytical insights into concrete, actionable recommendations for business, process or product improvements. Making recommendations for the collection of new data or the refinement of existing data sources and storage. Developing best practice guidelines for instrumentation and experimentation. WHAT YOU WILL BRING Ability to manipulate and analyse complex, high-volume, high-dimensionality data and metadata from varying sources. Strong passion for empirical research and for answering hard questions with data. Expert knowledge of analysis tools and big data technologies (Map/Reduce, Hadoop, Hive, etc.). Familiarity with relational/non-relational data manipulation, machine learning, and scientific statistical analysis. Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner. Flexible analytical approach that allows for results at varying levels of precision. Solid understanding and experience with programming logic and various paradigms. At least 3-5 years' experience in a data science environment (experience may be corporate, research/government or academia) coupled with tertiary qualifications to a Master's or PhD level in a relevant technical field. A diverse and inclusive workplace works better for everyone: Our goal is to foster a culture that fills us with pride, rooted in trust and respect. NAB is committed to creating a positive and supportive environment where everyone is encouraged to embrace their true, authentic selves. A diverse and inclusive workplace where our differences are celebrated, and our contributions are valued. It's a huge part of what makes NAB such a special place to be.
More focus on you: We're committed to delivering a positive experience for our colleagues and a workplace you can be proud of. We support our colleagues to balance their careers and personal life through flexible working arrangements such as hybrid working and job sharing, and competitive financial and lifestyle benefits. We invest in our colleagues through world-class development programs (Distinctive Leadership and Career Qualified in Banking), and empower you to learn, grow and pursue exciting career opportunities. Join NAB India: This is your chance to join NAB India and, with your experience and expertise, help shape an innovation-driven organisation that focuses on making a positive impact in the lives of its customers, colleagues and communities. To know more about us, please click here. To know more about NAB Global Innovation Centres, please click here. We're on LinkedIn: NAB Innovation Centre India
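For illustration only: a small Python sketch of the kind of experiment deep dive the role describes, a Welch two-sample t-test on synthetic A/B metric data generated inside the snippet itself. The group means and sizes are arbitrary example values with no real-world meaning.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic metric for control and treatment groups (illustrative values only).
control = rng.normal(loc=0.112, scale=0.03, size=5_000)
treatment = rng.normal(loc=0.118, scale=0.03, size=5_000)

# Welch's t-test does not assume equal variances across groups.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
lift = treatment.mean() - control.mean()

print(f"observed lift: {lift:.4f}")
print(f"t-statistic:   {t_stat:.2f}")
print(f"p-value:       {p_value:.4f}")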

Posted 1 week ago

Apply

8.0 - 13.0 years

35 - 40 Lacs

Pune

Work from Office

Posted on: 7/22/2025 We're looking for a Senior Principal Software Engineer - Digital Workplace. This role is office based in our Pune/Hyderabad, India office. Cornerstone OnDemand is looking for a Digital Solutions Engineer who will work closely with the Digital Workplace Tools (DWT) Lead to deliver solutions that enhance the digital workplace experience around Azure AI Foundry, Azure ML, and the overall M365 suite of apps, including SharePoint (document management and intranet) as well as Power Platform. You will be responsible for designing, developing, and maintaining our SharePoint environment, as well as leveraging the Microsoft Power Platform and Azure AI to create powerful solutions for our business needs. In this role you will... Drive innovation through hands-on development of AI-powered solutions architecture, incorporating Azure OpenAI, Azure Agentic AI, Azure AI Document Intelligence, and Azure ML to create intelligent automation and data processing capabilities. Design, implement, and deliver a comprehensive Microsoft 365 enterprise strategy through hands-on development, encompassing SharePoint, Teams, Power Platform, Azure, and Azure AI solutions - including a COE (Center of Excellence) framework, governance models, security architecture, and integration patterns aligned with business objectives. Architect and actively develop modern intelligent workplace solutions leveraging the Microsoft 365 ecosystem, focusing on scalable and sustainable intranet portals, collaboration frameworks, and enterprise content management systems. Design and implement enterprise integration patterns and reference architectures for seamless connectivity between SharePoint, the Microsoft 365 suite, and third-party enterprise applications. Establish and maintain architectural standards and best practices for SharePoint security, compliance, and governance, including disaster recovery and business continuity planning through practical implementation experience. Create and implement architectural blueprints for responsive and modern SharePoint experiences, including information architecture, site hierarchies, and templates that align with organizational taxonomy. Develop the technical vision and roadmap for SharePoint-based solutions, with hands-on expertise in modern development frameworks (SPFx), microservices architecture, and cloud-native patterns. Design and build scalable integration architectures utilizing Microsoft Graph API, REST services, and modern authentication protocols to enable secure and efficient cross-platform solutions. Establish and implement front-end architecture standards incorporating modern web technologies (HTML5, TypeScript, React) and responsive design patterns for optimal user experience across devices. Architect and develop comprehensive SharePoint information management solutions including metadata frameworks, content types, and taxonomy models that support enterprise search and knowledge management. Lead technical strategy sessions and provide architectural guidance to development teams, stakeholders, and clients while mentoring junior architects and senior developers. Demonstrate strong hands-on expertise across the entire Microsoft 365 and Azure stack to effectively design, develop, and deliver enterprise solutions while maintaining architectural integrity. You Have What It Takes If You Have...
Requirements: Bachelor's degree in computer science or software engineering, or relevant experience. 8+ years of comprehensive Microsoft technology stack experience, with deep expertise in architecting M365 enterprise solutions. Technical Expertise: Proven track record in designing and implementing enterprise-scale Microsoft 365 architectures, including SharePoint, Teams, Power Platform, and Azure solutions. Expert-level understanding of cloud architecture patterns, focusing on Azure infrastructure design, resource optimization, and enterprise security models. Deep expertise in Microsoft 365 security architecture, including advanced identity management, conditional access, data protection, and compliance frameworks. Advanced proficiency in Microsoft Graph API architecture and enterprise integration patterns (a short illustrative Graph API sketch follows this listing). Expert-level experience in automation and DevOps practices using PowerShell, Azure DevOps, and Infrastructure as Code. Strong architectural expertise in Azure services, including Azure AI, Cognitive Services, OpenAI implementation, and cloud-native application design. Demonstrated mastery in Power Platform architecture, focusing on cross-environment ALM strategies, solution management, and governance frameworks. Good to have: knowledge of multi-cloud AI services integration, such as AWS, GCP, and Snowflake. Proven experience in translating business requirements into scalable enterprise architecture solutions. Strong track record of leading architectural decisions and driving technical strategy across large-scale projects. Extensive experience in enterprise UI/UX architecture and accessibility standards. Demonstrated success in client-facing roles, with the ability to communicate complex architectural concepts to diverse stakeholders. Experience in architecting business-critical applications, data flows, and large-scale migration strategies. Strategic mindset with the ability to drive innovation while maintaining enterprise architecture standards. Exceptional problem-solving abilities with a focus on scalable, future-proof solutions. Strong project and portfolio management experience, including risk assessment and mitigation strategies. Deep understanding of enterprise security, privacy, and compliance requirements. Proven ability to mentor teams and drive technical excellence. Demonstrated leadership in promoting inclusive, diverse work environments and driving collaborative success. Experience in establishing and maintaining architecture governance frameworks and best practices.
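For illustration only: a minimal Python sketch of a Microsoft Graph call of the kind this role involves, enumerating the lists in a SharePoint site via the /sites/{site-id}/lists endpoint. The site id and access token are placeholders; a production implementation would acquire the token through MSAL and handle paging and throttling.

import requests

GRAPH_BASE = "https://graph.microsoft.com/v1.0"
SITE_ID = "<tenant>.sharepoint.com,<site-collection-id>,<site-id>"   # placeholder composite id
ACCESS_TOKEN = "<azure-ad-access-token>"                             # placeholder

def list_site_lists(site_id: str) -> list[dict]:
    # Returns the SharePoint lists (id and display name) for the given site.
    response = requests.get(
        f"{GRAPH_BASE}/sites/{site_id}/lists",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("value", [])

if __name__ == "__main__":
    for sp_list in list_site_lists(SITE_ID):
        print(sp_list["id"], sp_list["displayName"])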

Posted 1 week ago

Apply

2.0 - 18.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Career Category Engineering Job Description [Role Name: IS Architecture] Job Posting Title: Data Architect Workday Job Profile: Principal IS Architect Department Name: Digital, Technology & Innovation Role GCF: 06A ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of Foundational Data. This role will manage a team of Data Modelers. Roles & Responsibilities: Provide oversight to data modeling team members. Develop and maintain conceptual, logical, and physical data models to support business needs. Establish and enforce data standards, governance policies, and best practices. Design and manage metadata structures to enhance information retrieval and usability. Maintain comprehensive documentation of the architecture, including principles, standards, and models. Evaluate and recommend technologies and tools that best fit the solution requirements. Evaluate emerging technologies and assess their potential impact. Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency. Basic Qualifications and Experience: [GCF Level 6A] Doctorate degree and 2 years of experience in Computer Science, IT or a related field OR Master's degree with 8-10 years of experience in Computer Science, IT or a related field OR Bachelor's degree with 10-14 years of experience in Computer Science, IT or a related field OR Diploma with 14-18 years of experience in Computer Science, IT or a related field. Functional Skills: Must-Have Skills: Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business subject matter experts to develop data models that are useful for their analysis needs. Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality. Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), and performance tuning of big data processing (a short illustrative PySpark sketch follows this listing). Good-to-Have Skills: Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic. Professional Certifications: Certifications in Databricks are desired. Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated awareness of presentation skills. Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. EQUAL OPPORTUNITY STATEMENT We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
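For illustration only: a short PySpark sketch of one common performance-tuning pattern on Spark/Databricks-style workloads mentioned above, broadcasting a small dimension table so a join with a large fact table avoids a full shuffle. The paths, column names, and yield metric are hypothetical.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("batch-yield-demo").getOrCreate()

# Large fact-like dataset and a small dimension-like lookup (paths are placeholders).
batches = spark.read.parquet("s3://<bucket>/manufacturing/batches/")
materials = spark.read.parquet("s3://<bucket>/manufacturing/dim_material/")

# Broadcasting the small dimension avoids shuffling the large batches table.
yield_by_group = (
    batches.join(broadcast(materials), on="material_code", how="left")
           .groupBy("material_group")
           .agg(
               F.sum("produced_qty").alias("produced_qty"),
               F.sum("rejected_qty").alias("rejected_qty"),
           )
           .withColumn("yield_pct", 1 - F.col("rejected_qty") / F.col("produced_qty"))
)

yield_by_group.show(truncate=False)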

Posted 1 week ago

Apply