Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
4.0 years
0 Lacs
India
Remote
Wobot AI is hiring a Senior Backend Developer (Node.js + ClickHouse) to help build the data backbone of our automation and vision intelligence platform. Explore the details below and see if you’re the right fit! What you'll do: Design and implement ingestion pipelines into ClickHouse for Computer Vision and other high-volume structured insights. Model efficient, scalable schemas using MergeTree, ReplacingMergeTree, and appropriate partitioning strategies. Implement deduplication, version control, and update-safe ingestion strategies tailored for real-time and mutable data. Build and maintain backend services and APIs that expose ClickHouse data to other systems such as product dashboards and internal workflows. Collaborate with CV and backend teams to ensure seamless data flow, system integration, and ingestion resilience. Work with product and data consumers to support high-performance analytical queries and structured data access. Monitor and maintain ingestion health, performance, observability, and error handling across the pipeline. Contribute to future-facing system design that enables AI agent integration, context-aware workflows, and evolving protocols such as MCP. What we are looking for: Must Have: 4 to 6 years of backend development experience with strong proficiency in Node.js. At least 1 year of production-grade experience with ClickHouse, including schema design and performance tuning. Experience building data pipelines using RabbitMQ, Pub/Sub, or other messaging systems. Solid understanding of time-series data, analytical query patterns, and distributed ingestion design. Familiarity with Google Cloud Platform and serverless development practices. Good to have: Experience with TypeScript in production backend systems. Exposure to building serverless applications using Cloud Run or AWS Lambda. Experience working with materialized views, TTL-based retention, and ingestion optimization in ClickHouse. 
Prior experience with Computer Vision pipelines or real-time data flows. Awareness of modern backend patterns that support AI/ML-generated insights, structured data orchestration, and agent-based interactions. Familiarity with designing systems that could interface with evolving protocols such as MCP or context-rich feedback systems. How we work: We use Microsoft Teams for daily communication and conduct daily standups and team meetings over Teams. We value open discussion, ownership, and a founder mindset. We prioritize design, amazing UI/UX, documentation, to-do lists, and data-based decision-making. We encourage team bonding through bi-weekly town halls, destressing sessions with a certified healer, and fun company retreats twice a year. We offer a 100% remote workplace model, health insurance, attractive equity options for top performers, mental health consultations, company-sponsored upskilling courses, growth hours, the chance to give back with 40 hours for community causes, and access to a financial advisor. Wobot is an Equal Opportunity Employer. If you have a passion for developing innovative solutions and want to work on cutting-edge technology, we encourage you to apply for this exciting opportunity.
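The update-safe ingestion the role describes maps to ClickHouse's ReplacingMergeTree semantics: per sorting key, the row with the highest version column survives. A minimal stdlib Python sketch of that keep-the-latest rule (the record fields are made up for illustration, not Wobot's actual schema):

```python
# Sketch of ReplacingMergeTree-style deduplication: for each ORDER BY key,
# keep only the row with the highest version column, as ClickHouse does
# lazily at merge time. Field names here are illustrative.

def dedupe_latest(rows, key="camera_id", version="ts"):
    """Return one row per key, keeping the highest version value."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return list(latest.values())

events = [
    {"camera_id": "cam-1", "ts": 1, "label": "person"},
    {"camera_id": "cam-1", "ts": 3, "label": "vehicle"},  # newest version wins
    {"camera_id": "cam-2", "ts": 2, "label": "person"},
    {"camera_id": "cam-1", "ts": 2, "label": "person"},
]

deduped = sorted(dedupe_latest(events), key=lambda r: r["camera_id"])
print(deduped)  # cam-1 keeps ts=3 ("vehicle"); cam-2 keeps its only row
```

In ClickHouse itself the same effect would come from a table declared with `ReplacingMergeTree(ts) ORDER BY camera_id`, queried with `FINAL` (or `argMax` aggregates) until background merges collapse the duplicates.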
Posted 21 hours ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Join our Team About this opportunity: Join our team as a Storage Engineer where you will be responsible for managing and optimizing storage solutions in both NetApp and HPE environments. Your role will encompass critical event monitoring, network interface control, and maintaining data integrity through effective storage management. This role is crucial in ensuring our storage infrastructure remains robust, efficient, and secure. What you will do: Monitor storage critical events and network interfaces. Check aggregate and volume usage using AIQUM. Configure and troubleshoot SnapMirror and SnapVault environments. Perform daily health checks for LUN, aggregate, and volume utilization. Manage SnapRestore and CIFS shares, and handle permissions for shared folders. Set up and troubleshoot vFiler, NFS, CIFS, and SAN (FC and iSCSI) environments. Conduct ONTAP OS upgrades and manage LUN assignments. Resolve hardware issues in coordination with NetApp support. Perform volume resizing, aggregate disk addition, and LUN resizing. Manage LUNs (LUN creation, LUN snapshots, manual and automatic igroup management, LUN restore) using SnapDrive. Perform DR tests for the environment. Manage snapshots, LUN clones, and FlexClones. Configure Windows iSCSI boot LUNs for the environment. Manage reports using Operations Manager. Work with qtrees for efficient storage utilization. Perform deduplication on volumes. Configure aliases and zoning for servers connected in the FC environment on Brocade and Cisco switches. Create and expand aggregates, volumes, and qtrees for NAS environments and LUNs and igroups for SAN environments, and reclaim LUNs and SAN ports as needed (decommissioning projects). Provide storage infrastructure system management, including capacity planning, performance monitoring and tuning, and security management.
Manage Tier-3 support following ITIL incident management practices. Bring proven experience with complex, enterprise-level NAS platforms in mission-critical environments. Lead technical assessments of hardware, software, tools, applications, firmware, middleware, and operating systems to support business operations. Apply strong product knowledge and troubleshooting skills on 3PAR/Primera, EVA, MSA, and Nearline/StoreOnce products. Handle storage remediation tasks such as HBA driver and firmware upgrades. Engage in capacity planning, performance monitoring, and tuning. Lead technical assessments and provide infrastructure support, including design, planning, and project deployment. The skills you bring: Minimum 2–6 years of experience in storage engineering, with a strong focus on NetApp, HP Primera, and HPE storage systems. Willingness to work in a 24x7 operational environment with rotating shifts, including weekends and holidays, to support critical infrastructure and ensure minimal downtime. Proficiency with NDMP backups and integration with third-party products. Experience performing disaster recovery tests and storage remediation, including HBA driver and firmware upgrades. Knowledge of HPE 3PAR/Primera, EVA/MSA, and Nearline/StoreOnce products. Understanding of operating systems, virtualization, and networking. Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply? Click here to find everything you need to know about our typical hiring process. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do.
We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. Primary country and city: India (IN) || Noida Req ID: 764325
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Client: Intra Edge Payroll: People Prime Job Description Title: ML/AI Engineer Location: Chennai Job Requirement: Machine Learning & AI Engineer specializing in semantic and image matching, entity resolution, and recommendation systems. Skilled in leveraging deep learning, NLP, and vector search technologies (CLIP, SBERT, FAISS) to build scalable and accurate product-matching solutions for e-commerce and B2B applications. Passionate about optimizing data-driven pricing strategies and enhancing customer experience through AI-powered solutions. Key Responsibilities: • Develop scalable product-matching pipelines using CLIP, SBERT, MPNet, and FAISS for multi-modal (text + image) matching. • Implement vector search and hybrid matching models combining deep learning and rule-based heuristics for high-accuracy matching. • Improve SKU normalization and deduplication for retail and B2B marketplaces, increasing match accuracy. • Design real-time product-matching APIs using Python and FastAPI for seamless integration across platforms. • Work with PyTorch, TensorFlow, Scikit-learn, and Hugging Face Transformers to fine-tune models on large-scale product datasets. • Collaborate with business and data teams to integrate AI-driven pricing insights into dynamic pricing models. Requirements Required Skills & Qualifications: • Experience: 5+ years in Machine Learning, NLP, or Computer Vision. • Machine Learning: Strong knowledge of NLP, embeddings, similarity search, and clustering. • Tech Stack: Python, PyTorch/TensorFlow, FAISS, SQL, NoSQL. • Search & Retrieval: Experience with vector databases (FAISS, Pinecone, Weaviate, Elasticsearch). • Cloud & Deployment: Azure/GCP, MLOps, API development (FastAPI, Flask). • E-Commerce & Pricing: Understanding of product-matching, search, and recommendation models.
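The CLIP/SBERT/FAISS matching pipeline described above boils down to nearest-neighbor search over embedding vectors. A toy sketch using plain cosine similarity (the vectors and SKUs are invented; in a real pipeline the vectors would come from an embedding model and FAISS would replace the linear scan at scale):

```python
# Toy sketch of embedding-based product matching: cosine similarity between
# vectors plays the role a FAISS index plays at scale. Vectors are made up;
# a real pipeline would embed titles/images with models like SBERT or CLIP.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

catalog = {
    "sku-001": [0.9, 0.1, 0.0],
    "sku-002": [0.0, 0.8, 0.6],
    "sku-003": [0.7, 0.3, 0.1],
}

def best_match(query_vec, catalog, threshold=0.8):
    """Return (sku, score) of the closest catalog item above threshold, else None."""
    sku, score = max(((s, cosine(query_vec, v)) for s, v in catalog.items()),
                     key=lambda p: p[1])
    return (sku, score) if score >= threshold else None

match = best_match([0.85, 0.2, 0.05], catalog)
print(match)  # closest to sku-001
```

With vectors L2-normalized, a FAISS inner-product index returns the same ranking with batched, SIMD-accelerated search, which is what makes million-SKU catalogs tractable.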
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Our client, founded in 2002 with offices in the US, India, Europe, Canada, Singapore, Costa Rica, Brazil, and the UK, has national and international scope and reach, backed by decades of experience and deep domain expertise. They specialize in products such as AI Governance/Data Privacy and services such as Interactive (Product, Discovery, Research, User Journey, Prototyping), Talent, Cloud (Development, Transformation, SRE, Architecture), Engineering (Web, Mobile, Strategy), Enterprise (Salesforce, ServiceNow, SAP, Oracle, Microsoft, Workday), Training (Corporate Learning Design and Development), and building offshore, cost-effective captive Global Capability Centers. Job Requirement: Machine Learning & AI Engineer specializing in semantic and image matching, entity resolution, and recommendation systems. Skilled in leveraging deep learning, NLP, and vector search technologies (CLIP, SBERT, FAISS) to build scalable and accurate product-matching solutions for e-commerce and B2B applications. Passionate about optimizing data-driven pricing strategies and enhancing customer experience through AI-powered solutions. Job Title: AI/ML Engineer Key Skills: Python, AI, ML, Azure Job Locations: Chennai Experience: 5 to 8 years Education Qualification: BE/B.Tech/any other science discipline Work Mode: Hybrid Employment Type: Contract Notice Period: Immediate - 15 days Key Responsibilities: • Develop scalable product-matching pipelines using CLIP, SBERT, MPNet, and FAISS for multi-modal (text + image) matching. • Implement vector search and hybrid matching models combining deep learning and rule-based heuristics for high-accuracy matching. • Improve SKU normalization and deduplication for retail and B2B marketplaces, increasing match accuracy. • Design real-time product-matching APIs using Python and FastAPI for seamless integration across platforms.
• Work with PyTorch, TensorFlow, Scikit-learn, and Hugging Face Transformers to fine-tune models on large-scale product datasets. • Collaborate with business and data teams to integrate AI-driven pricing insights into dynamic pricing models. Required Skills & Qualifications: • Experience: 5+ years in Machine Learning, NLP, or Computer Vision. • Machine Learning: Strong knowledge of NLP, embeddings, similarity search, and clustering. • Tech Stack: Python, PyTorch/TensorFlow, FAISS, SQL, NoSQL. • Search & Retrieval: Experience with vector databases (FAISS, Pinecone, Weaviate, Elasticsearch). • Cloud & Deployment: Azure/GCP, MLOps, API development (FastAPI, Flask). • E-Commerce & Pricing: Understanding of product-matching, search, and recommendation models.
Posted 1 day ago
0.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India Job ID 764325 Join our Team About this opportunity: Join our team as a Storage Engineer where you will be responsible for managing and optimizing storage solutions in both NetApp and HPE environments. Your role will encompass critical event monitoring, network interface control, and maintaining data integrity through effective storage management. This role is crucial in ensuring our storage infrastructure remains robust, efficient, and secure. What you will do: Monitor storage critical events and network interfaces. Check aggregate and volume usage using AIQUM. Configure and troubleshoot SnapMirror and SnapVault environments. Perform daily health checks for LUN, aggregate, and volume utilization. Manage SnapRestore and CIFS shares, and handle permissions for shared folders. Set up and troubleshoot vFiler, NFS, CIFS, and SAN (FC and iSCSI) environments. Conduct ONTAP OS upgrades and manage LUN assignments. Resolve hardware issues in coordination with NetApp support. Perform volume resizing, aggregate disk addition, and LUN resizing. Manage LUNs (LUN creation, LUN snapshots, manual and automatic igroup management, LUN restore) using SnapDrive. Perform DR tests for the environment. Manage snapshots, LUN clones, and FlexClones. Configure Windows iSCSI boot LUNs for the environment. Manage reports using Operations Manager. Work with qtrees for efficient storage utilization. Perform deduplication on volumes. Configure aliases and zoning for servers connected in the FC environment on Brocade and Cisco switches. Create and expand aggregates, volumes, and qtrees for NAS environments and LUNs and igroups for SAN environments, and reclaim LUNs and SAN ports as needed (decommissioning projects).
Provide storage infrastructure system management, including capacity planning, performance monitoring and tuning, and security management. Manage Tier-3 support following ITIL incident management practices. Bring proven experience with complex, enterprise-level NAS platforms in mission-critical environments. Lead technical assessments of hardware, software, tools, applications, firmware, middleware, and operating systems to support business operations. Apply strong product knowledge and troubleshooting skills on 3PAR/Primera, EVA, MSA, and Nearline/StoreOnce products. Handle storage remediation tasks such as HBA driver and firmware upgrades. Engage in capacity planning, performance monitoring, and tuning. Lead technical assessments and provide infrastructure support, including design, planning, and project deployment. The skills you bring: Minimum 2–6 years of experience in storage engineering, with a strong focus on NetApp, HP Primera, and HPE storage systems. Willingness to work in a 24x7 operational environment with rotating shifts, including weekends and holidays, to support critical infrastructure and ensure minimal downtime. Proficiency with NDMP backups and integration with third-party products. Experience performing disaster recovery tests and storage remediation, including HBA driver and firmware upgrades. Knowledge of HPE 3PAR/Primera, EVA/MSA, and Nearline/StoreOnce products. Understanding of operating systems, virtualization, and networking. Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply?
Posted 1 day ago
0.0 - 3.0 years
0 Lacs
Mohali, Punjab
Remote
Job Title: Assistant Project Manager – Technical / CIR Location: Mohali, Punjab Experience Required: 3+ years in Project Management (Legal or Technical Domain) Job Type: Full-time | On-site Joining: Immediate joiners preferred About the Role We are looking for a technically proficient and detail-oriented Assistant Project Manager to support and lead Document-Based Review (DBR) projects in a fast-paced legal-tech environment. This role is ideal for professionals with strong scripting and automation skills who are comfortable working with large datasets and collaborating across legal and technical teams. Key Responsibilities Execute automation scripts for structured data extraction (including OCR from PDFs) Format and export data to Excel with logic-based structuring Use Python, MySQL, and Excel for data deduplication and cleanup Perform data grouping, classification, and integrity checks Automate daily team productivity reporting Collaborate closely with both legal and technical teams to ensure project accuracy and efficiency Requirements 1 to 3 years of experience in legal support, data operations, or technical project management Strong command of Python, MySQL, Microsoft Excel, OCR tools, and SharePoint Excellent communication, analytical, and problem-solving abilities Must be based in or willing to relocate to Mohali/Chandigarh Must be available for full-time, on-site work (remote work not available) How to Apply: Interested candidates may send their resume to rashika@huntingcherry.com Job Type: Full-time Pay: ₹1,500,000.00 - ₹2,000,000.00 per year Work Location: In person
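The Python/MySQL deduplication-and-cleanup duty above typically means normalizing noisy OCR output before comparing rows. A small stdlib sketch under assumed, illustrative record fields (not a real project schema):

```python
# Sketch of the data cleanup/deduplication step: normalize noisy OCR-extracted
# fields, then drop exact duplicates while preserving first-seen order.
# Record fields are illustrative, not a real project schema.

def normalize(record):
    """Lowercase and collapse whitespace so near-identical rows compare equal."""
    return {k: " ".join(str(v).split()).lower() for k, v in record.items()}

def deduplicate(records):
    """Drop duplicates after normalization, keeping first-seen order."""
    seen, unique = set(), []
    for rec in map(normalize, records):
        key = tuple(sorted(rec.items()))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

rows = [
    {"doc_id": "A-101", "party": "Acme  Corp"},
    {"doc_id": "a-101", "party": "ACME CORP"},  # same row after normalization
    {"doc_id": "A-102", "party": "Beta LLC"},
]

cleaned = deduplicate(rows)
print(cleaned)  # 2 unique rows remain
```

The same normalize-then-compare idea carries over to SQL (e.g. grouping on `LOWER(TRIM(col))`) or to Excel exports once the data is cleaned in Python.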
Posted 2 days ago
12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Greetings from Tata Consultancy Services Job openings at TCS Skill: Backup Admin Experience range: 12 to 18+ years Role: Permanent Job location: Chennai/Hyderabad/Bangalore Current location: Anywhere in India Interview date: 18th Jun '25 (Wednesday), 10:00 AM to 12:00 PM IST / 30 mins Interview mode: MS Teams Please find the job description below. Architecture & Design - Commvault Environment & Data Center Administration • Plan and design the backup network architecture (for cloud and/or on-prem) • Plan and provide the backup solution for new customers • Backup DR setup for new customers • Tape library implementation • Tape drive setup • Tape labeling setup for different customers • Iron Mountain account setup for all regions • SAN configuration from Commvault based on front-end data • Commvault license procurement • Commvault vendor setup across different regions • Migrate backed-up data from one storage system to another via Commvault • Migrate deduplicated data from one disk to other disks • Physical hardware implementation (network, storage, compute, etc.) • Rack implementation • DC cabling and patch panel implementation Backup Operation Activities • CommServe installation for different customers with DR setup (standby CommServe) • Set up global deduplication policies and storage policies • Client module installation, configuration, and scheduling per customer requirements • Build media agents per customer requirements • Manage user access for different customers • Monitor backups and performance from Commvault • Troubleshoot performance issues • Disk space management on media agents and handling of dedup backups • Retention policy management via primary/secondary copy policies • Vault tracker policy management (tape handling) • Tape library and drive configuration • Drive slot assignment • Set up (install and configure) and manage DB backups (RMAN, SQL, Sybase, Exchange) and NDMP backups.
• Set up backups in SQL cluster environments. • Set up and maintain backups on Oracle RAC servers. • Set up VMware snapshot backups in Commvault. • Manage VMware snapshot backups and restore operations. • Backup using SnapProtect operations on VNX/CLARiiON. • Media resource management (media agents, disk and tape libraries, media). • Manage firewall and communication issues efficiently. • Manage capacity licenses and troubleshoot license issues. • Monitor drives and devices. • Configure/deactivate backup policies for new and decommissioned clients. • Configure tape drives, tape drive cleaning, and stuck-media removal. • Troubleshoot failed backups within the SLA timeline and re-run the backups. • Perform restores for system state, Exchange servers, databases, and file systems for Windows and UNIX servers. • Perform 1-Touch restores. • Upgrade service packs for all clients in a timely manner. • Upgrade the Commvault version for the backup server and clients in a timely manner. • Handle various backup failures and log cases with the Commvault vendor. • DR plan execution. • Troubleshoot deduplication database issues. • Proactively monitor for slow backups, hung jobs, and long-queued jobs in the environment and take necessary action. • Strive for customer satisfaction while doing restores: keep tickets updated with all activities, track the SLA of all tickets, and update the customer regularly about the work being done on the respective restore tickets. • Collaborate with other teams, users, clients, onsite/offsite facilities, and vendors such as Iron Mountain, Symantec, and HP. • Send reports to top management after analyzing the performance of critical servers. Thanks & Regards, Priyanka Talent Acquisition Group Tata Consultancy Services
Posted 2 days ago
2.0 years
0 - 0 Lacs
India
On-site
We are looking for a results-driven Email Marketer to lead and execute targeted email campaigns that drive engagement, conversions, and long-term customer relationships. This role will suit someone who thrives in a fast-paced marketing environment and is passionate about performance-based outreach and personalisation. You will be responsible for planning, creating, testing, and analysing outbound email campaigns across multiple segments. A strong grasp of customer segmentation, performance metrics, and creative messaging is essential. Responsibilities · Plan, write, and send targeted email campaigns to cold and warm lists using advanced tools and sequences · Maintain and optimise email deliverability, sender reputation, and open/click-through rates · Monitor and report on campaign performance metrics (CTR, open rate, bounce rate, conversions) · Continuously test email templates, subject lines, timing, and audience segments for maximum ROI · Conduct detailed data mining to build accurate and qualified prospect lists · Maintain database hygiene, deduplication, and proper tagging · Work closely with the sales and content teams to align messaging and lead generation objectives · Ensure all emails are compliant with GDPR and email marketing best practices Bonus Points · Previous hands-on experience using Apollo.io for outbound campaigns and contact discovery · Proven record of driving lead generation through cold outreach · Experience in scraping, filtering, and segmenting large datasets for targeting Skills and Requirements · 2+ years experience in email marketing, outreach, or B2B lead generation · Strong copywriting skills tailored for direct response and personalisation · Proficiency with email marketing platforms and automation tools · Excellent attention to detail and analytical skills · Familiarity with CRMs and list-building tools · Understanding of GDPR and international email compliance standards Why Join Us · Competitive salary with performance-based growth · 
Opportunity to work in a flexible, digital-first team · Be part of a results-oriented culture where your ideas drive impact · Access to premium tools and ongoing professional development Job Types: Full-time, Permanent Pay: ₹40,000.00 - ₹80,000.00 per month Benefits: Paid sick time Paid time off Schedule: UK shift Work Location: In person
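The campaign performance metrics listed in the responsibilities above (open rate, CTR, bounce rate) reduce to simple ratios. A small sketch with invented numbers; note that conventions vary, e.g. whether opens and clicks are measured against sends or against delivered mail, so this is one reasonable choice, not a standard:

```python
# Sketch of the campaign reporting duty: compute bounce rate, open rate, and
# click-through rate from raw send counts. Numbers are made up; opens and
# clicks are measured against delivered mail here (one common convention).

def campaign_metrics(sent, delivered, opened, clicked, bounced):
    """Return campaign rates as percentages rounded to two decimals."""
    return {
        "bounce_rate": round(100 * bounced / sent, 2),
        "open_rate": round(100 * opened / delivered, 2),
        "ctr": round(100 * clicked / delivered, 2),
    }

report = campaign_metrics(sent=1000, delivered=950, opened=285,
                          clicked=57, bounced=50)
print(report)  # {'bounce_rate': 5.0, 'open_rate': 30.0, 'ctr': 6.0}
```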
Posted 2 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Database Architect Location: Hyderabad Experience: 10+ Years Employment Type: Full-time Job Summary: We are seeking a seasoned Database Architect with a strong background in Google SQL, data pipeline optimization, and data visualization to lead database architecture, data analytics, and insights generation efforts. This role requires deep technical expertise, strong analytical capabilities, and leadership skills to support high-volume, time-sensitive data operations and strategic decision-making for business and product teams. Primary Skills Required: Expertise in Google SQL, F1 Query, and PLX workflows. Strong command of SQL and advanced SQL optimization for large-scale, time-sensitive datasets. Proficiency in data visualization tools: Looker, Looker Studio, and Google Sheets. Hands-on experience in data pipeline design, transformation, and automation. Proven ability to generate business insights through data analytics, wrangling, deduplication, and reporting. Proficient coding skills in SQL, with working knowledge of Java, C, or C++. Experience with GCP Cloud infrastructure and services. Exceptional oral and written communication skills, and ability to lead data initiatives across functions.
Posted 2 days ago
4.0 years
0 Lacs
India
On-site
We are seeking a skilled and motivated Salesforce Developer to design, develop, and implement customized solutions within the Salesforce platform. The ideal candidate will have deep knowledge of Salesforce development tools, APIs, and integration practices to support business goals, improve user experience, and enhance CRM capabilities. Key Responsibilities: Design and develop customized Salesforce applications and solutions using Apex, Visualforce, Lightning Components, and Flows Apply a strong understanding of REST API integration to build scalable, secure, and high-performance solutions that support customer service and sales operations Integrate Salesforce with internal and external systems (APIs, middleware, third-party platforms) Translate business requirements into well-architected technical solutions Maintain and enhance existing Salesforce applications and ensure performance, scalability, and security Participate in all phases of the software development lifecycle, including design, development, testing, and deployment Work closely with cross-functional teams (Sales, Marketing, Service, etc.) 
to understand CRM needs Create and maintain technical documentation, including data models, process flows, and custom code Conduct unit testing, debugging, and support for deployed applications Stay up to date with Salesforce releases and new features; provide recommendations for improvement Support data migration, cleansing, and deduplication initiatives Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field 4+ years of hands-on Salesforce development experience Proficiency in Apex, SOQL, SOSL, Visualforce, and Lightning Web Components (LWC) Experience with Salesforce configuration, automation (Flows/Process Builder), and deployment tools (Change Sets, SFDX) Understanding of REST/SOAP APIs and integration patterns Familiarity with version control systems like Git Salesforce Platform Developer I certification (required) Preferred Qualifications: Salesforce Platform Developer II or other certifications (Admin, Sales Cloud, Service Cloud) Experience with Agile/Scrum methodologies Knowledge of CI/CD pipelines and DevOps practices for Salesforce Experience with third-party tools such as MuleSoft, Gearset, or Jitterbit Soft Skills: Strong analytical and problem-solving skills Excellent communication and collaboration abilities Attention to detail and ability to manage multiple tasks efficiently
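The REST integration work above usually starts with Salesforce's REST API query resource (`/services/data/vXX.X/query?q=<SOQL>`). A sketch in Python that only builds the request URL; the instance URL and API version are placeholders, and authentication via an OAuth bearer token is assumed but not performed here:

```python
# Sketch of calling the Salesforce REST query endpoint from an external
# system. The instance URL and API version are placeholder assumptions;
# the /services/data/vXX.X/query?q=<SOQL> path is the documented pattern.
from urllib.parse import urlencode

def soql_query_url(instance_url, soql, api_version="v58.0"):
    """Build the GET URL for a SOQL query against the Salesforce REST API."""
    return f"{instance_url}/services/data/{api_version}/query?{urlencode({'q': soql})}"

url = soql_query_url(
    "https://example.my.salesforce.com",
    "SELECT Id, Name FROM Account WHERE CreatedDate = TODAY",
)
print(url)
# The actual GET request would carry an OAuth access token in the header:
#   Authorization: Bearer <access_token>
```

Keeping URL construction separate from the HTTP call like this makes the integration easy to unit-test without touching a live org.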
Posted 3 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role: Manager / Sr. Manager - MDM Experience: 7-12 years Job Location: Gurgaon/Noida/Bangalore/Hyderabad Your responsibilities include, but are not limited to: Participate in overall architecture, capacity planning, development, and implementation of Master Data Management (MDM) solutions. Use MDM technologies and tools across the enterprise to enable the management and integration of master data. Understand the current technical landscape as well as the desired future state. Assess the current-state architecture and understand current business processes for managing MDM solutions. Assess the functional and non-functional requirements of the desired future-state MDM solution. Prepare the to-be architecture, including data ingestion, data quality rules, data model, match/merge, workflows, UI, batch integration, and real-time services. Bring extensive hands-on experience in installation and configuration of core Informatica MDM Hub components such as the Hub Console, Hub Store, Hub Server, Cleanse/Match Server, and Cleanse Adapter. Deliver full-lifecycle MDM projects for clients, including data modeling, metadata management, design and configuration of matching and merging rules, and design and configuration of standardization, cleansing, and deduplication rules. Create design documents and data models addressing business needs for the client MDM environment. Contribute to creating reusable assets and accelerators for MDM platforms. Be involved in integration/transfer of data across multiple systems, streamlining data processes and providing access to MDM data across the enterprise. Make technology decisions related to the client MDM environment, and interpret requirements to architect MDM solutions. Provide subject matter expertise on data architecture and data integration implementations across various downstream systems. Coordinate with Project Managers and participate in project planning and recurring meetings.
Collaborate with other team members to review prototypes and develop iterative revisions. Must-have skills: 5-12 years of experience, with hands-on experience working on MDM projects and with one or more MDM tools such as Informatica or Reltio, and expertise in defining matching/merging and survivorship rules. Strong commercial knowledge of key business processes and compliance requirements within the pharma industry across multiple master data domains such as Physician and Product. Hands-on experience with industry data quality tools such as Informatica IDQ and IBM Data Quality. Proficient in reading and understanding data models, with experience working with data and databases. Strong technical experience in Master Data Management, metadata management, data quality, data governance, data integration (ETL), and data security. Experience with all stages of the MDM SDLC: planning, designing, building, deploying, and maintaining scalable, highly available, mission-critical enterprise-wide applications for large enterprises. Experience integrating MDM with data warehouses and data lakes. Excellent query-writing skills, with working knowledge of Oracle, SQL Server, and other major databases. Good knowledge of SOA/real-time integration, the pub-sub model, and data integration with various CRM systems such as Veeva and Siebel. Expertise in engaging with business users to understand business requirements and articulate the value proposition. Experience working with third-party data providers such as IQVIA, SHS, and Veeva. Solid experience configuring third-party address standardization tools such as Address Doctor or Loqate. Provide subject matter expertise on data architecture and data integration implementations across various downstream systems. Excellent written and verbal communication skills and innovative presentation skills. Education: BE/B.Tech, MCA, M.Sc., M.Tech, or MBA with 60%+. Why Axtria? Axtria (www.Axtria.com) is truly a new-age software product unicorn, the first of its kind in providing cloud software and data analytics to the Life Sciences industry globally. We help Life Sciences companies transform the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. We are acutely aware that our work impacts millions of patients and lead passionately to improve their lives. Since our founding in 2010, technology innovation has been our winning differentiation, and we continue to leapfrog the competition with platforms that deploy Artificial Intelligence and Machine Learning. Our cloud-based platforms - Axtria DataMAX™, Axtria InsightsMAX™, Axtria SALESIQ™, Axtria CUSTOMERIQ™, and Axtria MarketingIQ - enable customers to efficiently manage data, leverage data science to deliver insights for sales and marketing planning, and manage end-to-end commercial operations. With customers in over 20 countries, Axtria is one of the biggest global commercial solutions providers in the Life Sciences industry. We continue to win industry recognition for growth and are featured in some of the most aspirational lists - INC 5000, Deloitte FAST 500, NJBiz FAST 50, SmartCEO Future 50, Red Herring 100, and several other growth and technology awards. Axtria is looking for exceptional talent to join our rapidly growing global team. People are our biggest perk! Our transparent and collaborative culture offers a chance to work with some of the brightest minds in the industry. Our data analytics and software platforms support data science, commercial operations, and cloud information management.
We enable commercial excellence through our cloud-based sales planning and operations platform. We are leaders in managing data using the latest cloud information management and big data technologies. Axtria Institute, our in-house university, offers the best training in the industry and an opportunity to learn in a structured environment. A customized career progression plan ensures every associate is set up for success and able to do meaningful work in a fun environment. We want our legacy to be the leaders we produce for the industry. 3500+ employees worldwide - growing rapidly and strengthening our product engineering team; we expect to almost double our India headcount in the coming year.
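The matching, merging, and survivorship rules this role calls for can be sketched in plain Python. This is a minimal illustration of the idea (pick the most trusted, most recent non-null value per attribute when merging duplicates); the source rankings and field names are assumptions for the example, not Informatica MDM configuration:

```python
from datetime import date

# Illustrative source trust ranking (higher wins); a real MDM hub
# configures this per attribute in its survivorship rules.
SOURCE_RANK = {"CRM": 3, "ERP": 2, "WEB": 1}

def survive(records, fields):
    """Merge duplicate records: per field, keep the non-null value from
    the most trusted source, breaking ties by recency."""
    golden = {}
    for field in fields:
        candidates = [r for r in records if r.get(field) is not None]
        golden[field] = max(
            candidates,
            key=lambda r: (SOURCE_RANK.get(r["source"], 0), r["updated"]),
        )[field] if candidates else None
    return golden

records = [
    {"source": "WEB", "updated": date(2024, 5, 1), "email": "a@x.com", "phone": None},
    {"source": "CRM", "updated": date(2024, 1, 1), "email": None, "phone": "555-0100"},
]
print(survive(records, ["email", "phone"]))  # email from WEB, phone from CRM
```

Real survivorship configurations layer on trust decay, per-attribute overrides, and manual steward decisions; the per-field "most trusted non-null, then most recent" rule above is the common starting point.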
Posted 3 days ago
15.0 years
0 Lacs
India
On-site
Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins. Why join Coupa? 🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend. 🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence. 🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other. Learn more on Life at Coupa blog and hear from our employees about their experiences working at Coupa. The Impact of a Sr. Principal Software Engineer (Analytics) to Coupa: As a member of the development group, you will become part of a team that develops and maintains one of Coupa’s software products developed using Ruby and React, built as a multi-tenant SaaS solution on all Cloud Platforms like AWS, Windows Azure & GCP. We expect that you are a strong leader with extensive technical experience. You have a well-founded analytical approach to finding good solutions, a strong sense of responsibility, and excellent skills in communication and planning. You are proactive in your approach and a strong team player. What You’ll Do: Provide technical leadership across multiple software development teams by architecting scalable solutions and guiding implementation. Design and implement a high-performance, cloud-native analytics platform with API-first infrastructure for seamless data ingestion (Coupa and external spend data). 
Utilize AI-driven data classification to cleanse and harmonize datasets. Oversee data modeling, microservice orchestration, monitoring, and alerting. Collaborate with Engineering and Product leadership on feature design and maintenance release analysis to ensure robust customer-facing solutions. Mentor engineers, designers, and developers, while working cross-functionally with Product Management, Integrations, Services, Support, and Operations to ensure successful development and deployment.
What you will bring to Coupa:
Bachelor’s degree in Computer Science or a related field (or equivalent experience) with 15+ years developing enterprise SaaS applications using modern frameworks like Java, .Net, or C, with Python expertise. Familiar with AI/ML techniques for data cleansing, deduplication, and entity resolution, as well as MVC frameworks like Django or Rails. Full-stack experience including building responsive UIs, SPAs, and reusable components, with strong UI/UX sensibility. Solid grasp of microservices, event-driven architecture, backend integration via APIs, and working with both relational (SQL Server, MySQL, PostgreSQL, AWS Aurora) and NoSQL databases. Skilled in performance optimization, monitoring tools, CI/CD tooling, and deployment on cloud platforms (AWS, Azure, or GCP). Bonus experience includes Kafka or similar pub-sub systems, and Redis or other caching mechanisms.
Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees. Please be advised that inquiries or resumes from recruiters will not be accepted.
By submitting your application, you acknowledge that you have read Coupa’s Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
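The data cleansing, deduplication, and entity resolution this role mentions usually start with a normalization and blocking step before any ML-based matching. A hedged Python sketch of that first step, where the suffix list and example supplier names are illustrative assumptions:

```python
import re
from collections import defaultdict

def normalize(name):
    """Canonicalize a company name: lowercase, strip punctuation and
    common legal suffixes so near-duplicate spellings collide."""
    n = re.sub(r"[^a-z0-9 ]", "", name.lower())
    n = re.sub(r"\b(inc|llc|ltd|corp|co)\b", "", n)  # illustrative suffix list
    return " ".join(n.split())

def block(names):
    """Group candidate duplicate suppliers under one blocking key;
    a matcher then only compares records within a group."""
    groups = defaultdict(list)
    for name in names:
        groups[normalize(name)].append(name)
    return dict(groups)

print(block(["Acme Corp.", "ACME corp", "Globex LLC"]))
```

Blocking keeps pairwise comparison cost manageable at the scale of millions of supplier records, since only records sharing a key are compared by the downstream match model.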
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
We are seeking a SAS Data Integration Developer to design, develop, and maintain Campaign Management Data Mart (CMDM) solutions, integrate multiple data sources, and ensure data quality for marketing analytics and campaign execution.
Key Responsibilities
Data Integration & ETL Development
- Develop data ingestion, transformation, and deduplication pipelines.
- Standardize, cleanse, and validate large-scale customer data.
- Work with GaussDB, SAS ESP, APIs, and SAS DI Studio for data processing.
Master Data Management (CMDM) Configuration
- Implement unification and deduplication logic for a single customer view.
- Develop and manage data masking and encryption for security compliance.
API & CI360 Integration
- Integrate CMDM with SAS CI360 for seamless campaign execution.
- Ensure API connectivity and data flow across platforms.
Testing & Deployment
- Conduct unit, integration, and UAT testing.
- Deploy CMDM solutions to production and provide knowledge transfer.
Key Skills Required
- SAS Data Integration Studio (SAS DI Studio): design, develop, and maintain the Campaign Management Data Mart (CMDM)
- Data management (SAS Base, SQL, data cleansing)
- SAS ESP, GaussDB, and API integration
- Data governance (RBAC, GDPR, PII compliance)
- Data masking and encryption techniques
Skills: SAS, data integration, infomap, and ETL development
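The data masking responsibility above can be illustrated with a minimal Python sketch. The field names and masking scheme are illustrative assumptions, not SAS CMDM configuration; real compliance work would also cover tokenization and encryption at rest:

```python
def mask_pii(record):
    """Mask direct identifiers before exposing campaign data downstream."""
    masked = dict(record)
    if "email" in masked:
        user, _, domain = masked["email"].partition("@")
        masked["email"] = user[0] + "***@" + domain       # keep first char + domain
    if "phone" in masked:
        digits = masked["phone"]
        masked["phone"] = "*" * (len(digits) - 4) + digits[-4:]  # last 4 visible
    return masked

print(mask_pii({"email": "priya@example.com", "phone": "9876543210"}))
```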
Posted 4 days ago
5.0 years
0 Lacs
India
Remote
RED Global - Informatica C360 Technical Expert Contract - 100% Remote - 4 Months + Extension
On behalf of a global client, we're looking for an experienced Informatica C360 Cloud MDM expert to design, implement, and manage end-to-end MDM solutions. You'll work closely with business and technical teams to ensure high-quality, governed customer data across systems.
Role Description:
Duration - 4 Months + Extension
Language - English
Capacity - 5 days/week, 8 hours/day
Location - 100% remote, with CET time-zone overlap
Start - 01-07-2025
Key Responsibilities:
- Solution Design & Implementation: Design and configure Informatica Customer & Reference 360 (entities, hierarchies, UI, roles). Customize data models and mappings; define match, merge, and survivorship rules.
- Data Governance & Quality: Establish governance workflows and data quality rules. Implement profiling, cleansing, deduplication, and monitoring processes.
- Integration & Performance: Integrate MDM with source/target systems via APIs and batch processes. Optimize MDM jobs and configurations for scalability and performance.
- Technical Leadership: Provide hands-on support, issue resolution, and platform configuration. Document architectures, rules, and workflows; mentor delivery teams.
Qualifications:
- 5+ years in data management, 3+ years with Informatica IDMC (esp. Customer 360).
- Strong in the MDM lifecycle, data modelling, governance, and integration.
- Familiar with Power BI and Snowflake (nice to have); Informatica certifications a plus.
- Excellent analytical, problem-solving, and communication skills.
Please apply or send your updated CV to ryalamanchili@redglobal.com if interested and available.
Posted 4 days ago
4.0 years
0 Lacs
Dholera, Gujarat, India
On-site
About The Business - Tata Electronics Private Limited (TEPL) is a greenfield venture of the Tata Group with expertise in manufacturing precision components. Tata Electronics (a wholly owned subsidiary of Tata Sons Pvt. Ltd.) is building India’s first AI-enabled state-of-the-art Semiconductor Foundry. This facility will produce chips for applications such as power management IC, display drivers, microcontrollers (MCU) and high-performance computing logic, addressing the growing demand in markets such as automotive, computing and data storage, wireless communications and artificial intelligence. Tata Electronics is a subsidiary of the Tata group. The Tata Group operates in more than 100 countries across six continents, with the mission 'To improve the quality of life of the communities we serve globally, through long term stakeholder value creation based on leadership with Trust.’ Job Responsibilities - Architect and implement a scalable, offline Data Lake for structured, semi-structured, and unstructured data in an on-premises, air-gapped environment. Collaborate with Data Engineers, Factory IT, and Edge Device teams to enable seamless data ingestion and retrieval across the platform. Integrate with upstream systems like MES, SCADA, and process tools to capture high-frequency manufacturing data efficiently. Monitor and maintain system health, including compute resources, storage arrays, disk I/O, memory usage, and network throughput. Optimize Data Lake performance via partitioning, deduplication, compression (Parquet/ORC), and implementing effective indexing strategies. Select, integrate, and maintain tools like Apache Hadoop, Spark, Hive, HBase, and custom ETL pipelines suitable for offline deployment. Build custom ETL workflows for bulk and incremental data ingestion using Python, Spark, and shell scripting. Implement data governance policies covering access control, retention periods, and archival procedures with security and compliance in mind. 
Establish and test backup, failover, and disaster recovery protocols specifically designed for offline environments. Document architecture designs, optimization routines, job schedules, and standard operating procedures (SOPs) for platform maintenance. Conduct root cause analysis for hardware failures, system outages, or data integrity issues. Drive system scalability planning for future multi-fab or multi-site expansions.
Essential Attributes (Tech Stacks) -
Hands-on experience designing and maintaining offline or air-gapped Data Lake environments. Deep understanding of Hadoop ecosystem tools: HDFS, Hive, MapReduce, HBase, YARN, ZooKeeper, and Spark. Expertise in custom ETL design and large-scale batch and stream data ingestion. Strong scripting and automation capabilities using Bash and Python. Familiarity with data compression formats (ORC, Parquet) and ingestion frameworks (e.g., Flume). Working knowledge of message queues such as Kafka or RabbitMQ, with a focus on integration logic. Proven experience in system performance tuning, storage efficiency, and resource optimization.
Qualifications -
BE/ME in Computer Science, Machine Learning, Electronics Engineering, Applied Mathematics, or Statistics.
Desired Experience Level -
4 years of relevant experience post-Bachelor's, or 2 years post-Master's. Experience in the semiconductor industry is a plus.
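The partitioning and deduplication responsibilities described above can be sketched in plain Python, independent of Spark. The Hive-style partition layout, base path, field names, and versioning scheme are illustrative assumptions about the factory data model:

```python
def partition_path(record, base="/data/lake/measurements"):
    """Hive-style partition layout so queries can prune by date and tool."""
    return f"{base}/dt={record['ts'][:10]}/tool={record['tool_id']}"

def dedupe(batch):
    """Keep the latest version per (tool_id, ts) key: an update-safe
    step for incremental ingestion of mutable measurement data."""
    latest = {}
    for rec in batch:
        key = (rec["tool_id"], rec["ts"])
        if key not in latest or rec["version"] > latest[key]["version"]:
            latest[key] = rec
    return sorted(latest.values(), key=lambda r: r["ts"])

batch = [
    {"tool_id": "etch-01", "ts": "2024-06-01T10:00:00", "version": 1, "value": 7.1},
    {"tool_id": "etch-01", "ts": "2024-06-01T10:00:00", "version": 2, "value": 7.3},
]
print(partition_path(batch[0]))  # /data/lake/measurements/dt=2024-06-01/tool=etch-01
print(dedupe(batch))             # only version 2 survives
```

In a real pipeline the same keyed-latest logic would run as a Spark job before writing compressed Parquet/ORC files into the partition directories.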
Posted 4 days ago
5.0 years
0 Lacs
New Delhi, Delhi, India
On-site
About the Role:
We are looking for a hands-on Data Engineer to join our team and take full ownership of scraping pipelines and data quality. You'll be working on data from 60+ websites involving PDFs, processed via OCR and stored in MySQL/PostgreSQL. You’ll build robust, self-healing pipelines and fix common data issues (missing fields, duplication, formatting errors).
Responsibilities:
- Own and optimize Airflow scraping DAGs for 60+ sites
- Implement validation checks, retry logic, and error alerts
- Build pre-processing routines to clean OCR'd text
- Create data normalization and deduplication workflows
- Maintain data integrity across MySQL and PostgreSQL
- Collaborate with the ML team for downstream AI use cases
Requirements:
- 2-5 years of experience in Python-based data engineering
- Experience with Airflow, Pandas, OCR (Tesseract or AWS Textract)
- Solid SQL and schema design skills (MySQL/PostgreSQL)
- Familiarity with CSV processing and data pipelines
- Bonus: experience with scraping using Scrapy or Selenium
Location: Delhi (in-office only)
Salary Range: 50-80k/month
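The retry logic and validation checks in the responsibilities above can be sketched without Airflow specifics; the attempt count, backoff, and required fields below are illustrative assumptions, not the actual DAG configuration:

```python
import time

def with_retries(fn, attempts=3, delay=0.01):
    """Retry a flaky scrape/OCR step with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise                      # exhausted: surface for alerting
            time.sleep(delay * 2 ** i)

def validate(row, required=("title", "date", "body")):
    """Return the missing or blank fields so bad rows can be quarantined."""
    return [f for f in required if not str(row.get(f, "")).strip()]

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient scrape failure")
    return "page html"

print(with_retries(flaky))                        # succeeds on the third try
print(validate({"title": "Notice", "body": ""}))  # ['date', 'body']
```

In Airflow the retry half of this is usually configured declaratively on the task (retry count and delay), while the validation half runs as an explicit check step that routes failing rows to an error table.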
Posted 4 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts. Job Category Customer Success Job Details About Salesforce We’re Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too — driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good – you’ve come to the right place. The Data Technical Consultant is a demonstrated expert in technical and/or functional aspects of customer and partner engagements that lead to the successful delivery of data management projects. The Data Architect plays a critical role in setting customers up for success by prescriptively helping to shape, and then execute in, the Salesforce data space. This role also provides subject matter expertise related to data management solutions and ensures successful project delivery.
Requirements
- 6+ years’ experience working on complex data projects, including migrations, integrations, data architecture, and data governance
- Ability to convert high-level requirements into a solution - i.e., you do not need a detailed specification to be successful
- Solid experience with any of the mentioned ETL tools (e.g., SSIS, Boomi, Informatica PowerCenter, MuleSoft)
- Experienced technology leader with extensive data quality, data management, data security, and data governance skills
- Proficient in PL/SQL query writing, with a strong relational database background/understanding
- Basic understanding of DBAmp; experience with Salesforce Data Loader and writing SOQL is an asset
- Should be able to act as an SME in migration tools
- Experience with BI tools like Tableau, Qlik, and Salesforce Analytics is an asset
- Experience with master data management projects and practices is an asset
- Candidates should have at least 2 years of experience with Salesforce.com and a thorough understanding of the Salesforce.com project lifecycle
- Good organizational and customer service skills, so that clients are happy with the working of the organization
- Strong knowledge of enterprise analytics, big data platforms, data exchange models, CRM, MDM, and cloud integration is preferred
- Data modeling experience, specifically designing logical models/data dictionaries from business requirements
- Able to give architectural/design insight on the project if required
- Implementation experience and domain knowledge in Salesforce Sales and Service functionality and any of multiple CRM subject areas
- Excellent mentorship and team-handling skills
- Excellent troubleshooting skills
- Good understanding of Internet technologies: firewalls, web servers, web proxy servers, etc.
Responsibilities of a Data Consultant
- Elicit data requirements during business solution design sessions and translate these into a solution
- Data modelling for custom Salesforce.com projects/functionality
- Provide data governance thought leadership to clients so that they can manage and use their data to its fullest
- Conduct data audits, cleansing, and deduplication projects to clean data
- Conduct analysis of customer data to identify key predictors of activity, from lead management through to forecasting
- Develop ETL processes for data migration and integration projects
- Own and aggressively drive forward specific areas of technology architecture; provide architectural solutions/designs to project execution teams for implementation; lead projects within architecture
- Work with Product Owners/Business Analysts to understand functional requirements, and interact with other cross-functional teams to architect, design, develop, test, and release features
Team Level
- Support the day-to-day progress of team members' career goals as set in their Career Development Plans
- Mentor and coach team members through regular one-on-ones, with a primary focus on the people perspective and secondly the project perspective
- Guide new team members through their onboarding plans
- Motivate and inspire team members by showing appreciation and giving recognition for, not just hard work, but great work, and highlighting alignment with Traction's values
- Support multiple teams with planning, scoping, and creation of technical solutions for new product capabilities, through continuous delivery to production
- Liaise with the team and clients to resolve technical dependencies, issues, and risks
- Drive a common vision, practices, and capabilities across teams
- Work among a team of consultants to deliver Salesforce.com to our customers across all project stages (requirements definition, planning, functional design, prototyping, development, test, issue resolution, deployment, and support)
- Actively participate in the Guild and motivate the team on initiatives
- Motivate the team to use ReuseIT Traction products with existing assets and create reusable apps to add more value
- Analyse and identify gaps in functional/business requirements, and communicate effectively with both business and functional analysts on the same
- Manage more than 2 projects at a time and 2+ junior-and-above resources, making sure their goals are aligned with company objectives
- Deliver regular positive and constructive feedback to the team
- Assess the impact on technical design of changes in functional requirements
- Serve as a mentor to the team and actively engage as part of the project on assigned client engagements to ensure timely delivery as per best practices
Others
- Make sure your reports are aligned with the company V2MOM
- Ability to work under challenging environments and timelines
- Willingness to learn new technologies
- Ability to maintain cordial client relationships
- Good communication and presentation skills
- Should be willing to travel
Certification Requirements
- Desired: Salesforce Certified Administrator (201)
- Desired: Salesforce Certified Sales Cloud Consultant or Salesforce Certified Service Cloud Consultant
Accommodations
If you require assistance due to a disability when applying for open positions, please submit a request via this Accommodations Request Form.
Posting Statement
Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive, and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit.
The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
Posted 4 days ago
8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Role: Data QA Lead
Experience Required: 8+ years
Location: India/Remote
Company Overview
At Codvo.ai, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results. The Data Quality Analyst is responsible for ensuring the quality, accuracy, and consistency of data within the Customer and Loan Master Data API solution. This role will work closely with data owners, data modelers, and developers to identify and resolve data quality issues.
Key Responsibilities
- Lead and manage end-to-end ETL/data validation activities.
- Design test strategy, plans, and scenarios for source-to-target validation.
- Build automated data validation frameworks (SQL/Python/Great Expectations).
- Integrate tests with CI/CD pipelines (Jenkins, Azure DevOps).
- Perform data integrity, transformation logic, and reconciliation checks.
- Collaborate with Data Engineering, Product, and DevOps teams.
- Drive test metrics reporting, defect triage, and root cause analysis.
- Mentor QA team members and ensure process adherence.
Must-Have Skills
- 8+ years in QA, with 4+ years in ETL testing.
- Strong SQL and database testing experience.
- Proficiency with ETL tools (Airbyte, dbt, Informatica, etc.).
- Automation using Python or a similar scripting language.
- Solid understanding of data warehousing, SCDs, and deduplication.
- Experience with large datasets and structured/unstructured formats.
Preferred Skills
- Knowledge of data orchestration tools (Prefect, Airflow).
- Familiarity with data quality/observability tools.
- Experience with big data systems (Spark, Hive).
- Hands-on with test data generation (Faker, Mockaroo).
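A source-to-target reconciliation check like those described in this role can be sketched as follows. The choice of checks (row count plus a single numeric column checksum) is an illustrative minimum, not a full validation framework:

```python
def reconcile(source_rows, target_rows, sum_col=0):
    """Compare a source extract against the loaded target:
    row counts must match and a numeric column must checksum equal."""
    return {
        "row_count": len(source_rows) == len(target_rows),
        "checksum": sum(r[sum_col] for r in source_rows)
                    == sum(r[sum_col] for r in target_rows),
    }

source = [(100.0, "A"), (250.5, "B")]
target = [(100.0, "A"), (250.5, "B")]
print(reconcile(source, target))  # {'row_count': True, 'checksum': True}
```

Frameworks such as Great Expectations generalize this into declarative expectation suites that a CI/CD pipeline can run after every load; the two checks above are the classic smoke test that catches dropped rows and silently truncated values.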
Posted 5 days ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About Us: Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm’s mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology. About the role: As a Vendor Risk Operations team member, you will play a critical role in safeguarding Paytm from potential risks associated with our vendor ecosystem. You will be responsible for conducting comprehensive vendor risk assessments, ensuring compliance with internal policies and regulatory requirements, and actively contributing to the continuous improvement of our vendor risk management framework. This role requires a keen eye for detail, strong analytical skills, and the ability to collaborate effectively with various stakeholders. Conduct end-to-end vendor risk assessments across various risk domains (e.g., Vendor deduplication, information security, financial stability, business continuity, regulatory compliance, data privacy). Collaborate with business units to understand their vendor requirements and associated risks. Review vendor-provided documentation, certifications, and audit reports to identify potential vulnerabilities. Conduct Mystery-shopping wherever required Track and monitor vendor remediation efforts to ensure timely closure of identified risks. Maintain accurate and up-to-date vendor risk profiles and assessment records. Assist in the development and enhancement of vendor risk assessment methodologies, tools, and processes. Contribute to the ongoing development and implementation of Paytm's vendor risk management framework. Generate regular reports on vendor risk posture and assessment progress for internal stakeholders. Participate in ad-hoc projects and initiatives related to vendor risk management as required. 
Expectations/Requirements: Educational Qualification: Bachelor's degree in Business Administration, Finance, IT, Risk Management, or a related field. Experience: 2-5 years of experience in vendor risk management, third-party risk management, internal audit, compliance, or a similar risk-focused role. Domain Knowledge: Strong understanding of various risk domains, including information security, data privacy (e.g., GDPR, local data protection laws), financial risk, operational risk, and regulatory compliance. Understanding of Technology and User Experience: An appreciation for how technology solutions are built and how they impact user experience will be valuable in assessing vendor capabilities and potential risks. Analytical & Problem-Solving Skills: Excellent analytical and problem-solving skills with the ability to conduct deep dives, identify, assess, and mitigate risks effectively. Advanced Knowledge of Excel is required for data analysis and reporting. Basic knowledge of MySQL would be an added advantage for data retrieval and manipulation. Communication & Interpersonal Skills: Good communication and interpersonal skills, with the ability to present complex information clearly and concisely to diverse audiences. Strong written communication for documentation and reporting. High level of drive, initiative, and self-motivation. Ability to work independently, prioritize tasks, and manage multiple assessments simultaneously in a fast-paced environment. A willingness to experiment, learn quickly, and continuously improve processes and personal skills. Certifications (Preferred but not mandatory): CISA, CRISC, CISM, or other relevant certifications in risk management or information security. 
Why join us:
- A collaborative, output-driven program that brings cohesiveness across businesses through technology
- Improve the average revenue per user by increasing cross-sell opportunities
- Solid 360-degree feedback from your peer teams on your support of their goals
- Respect that is earned, not demanded, from your peers and manager
Compensation: If you are the right fit, we believe in creating wealth for you. With 500 mn+ registered users, 21 mn+ merchants, and the depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it.
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description
About TripStack - We are travel tech entrepreneurs, changing the way millions of people travel. Our proprietary virtual interlining technology provides access to billions of travel itineraries by combining flights from different airline carriers that don't traditionally work together. We take our customers from point A to B via C, at the lowest possible price. We are impacting the way people travel and provide higher-margin opportunities to our partners, some of the largest online travel agencies in the world. We pride ourselves on the performance-driven environment we have created for our teams to prosper and excel in. We come to work ready to challenge and be challenged. We're big enough to give our teams support but small enough that every person makes a difference. There are still plenty of challenges to champion.
Requirements
The Role - We are seeking an experienced data engineer to join our Data Engineering team embedded within the Data organization at TripStack.
Responsibilities -
Analyze and Organize Raw Data: Collect, clean, and structure raw data from diverse sources to make it suitable for further analysis and processing.
Develop Robust Data Systems and Pipelines: Design and implement resilient data systems and pipelines to support efficient data processing, storage, and retrieval, and establish data contracts with engineering teams.
Ensure Data Meets Business Needs: Ensure that datasets are properly prepared and maintained to meet the reporting and analytics needs of the business.
Prepare Data for Machine Learning: Collaborate with data scientists to prepare and process data for machine learning initiatives, ensuring compatibility and readiness for model training.
Enhance Data Quality and Reliability: Implement processes and technologies to improve data quality and reliability, including data validation, cleansing, and deduplication.
Collaborate on Analytics Data Flow Design: Work closely with data scientists and data architects to design and optimize the flow of analytics data, ensuring seamless integration and efficient data usage across the organization.
Requirements Gathering: Collaborate with cross-functional teams to gather and document business requirements for large-scale data engineering projects, ensuring a clear understanding of stakeholder needs.
Desired Skills & Experience -
Bachelor's degree in Computer Science or equivalent
5+ years of experience in data engineering or similar roles
Proficiency in Python
Experience working with structured and unstructured data
Experience with big data technologies, e.g., Spark, Kafka, and Apache Druid
Strong data modeling and SQL skills
Experience with orchestration tools, e.g., Apache Airflow, dbt, Databricks
Strong cross-functional collaboration skills
Nice to have -
Master's degree in computer science, mathematics, engineering, or a related discipline with 3+ years of experience
Experience with MLOps tools, e.g., MLflow
Airline travel industry experience is a plus
Benefits
What it takes to succeed here: Ambition and dedication to make a difference and change the way people travel; we always play to each other's strengths in a high-performing team reaching for our common goal. We hold ourselves to the highest expectations, move with a sense of urgency, hold ourselves accountable, and win by staying true to what we believe in.
What we offer: We offer an opportunity to work with a young, dynamic, and growing team composed of high-caliber professionals. We value professionalism and promote a culture where individuals are encouraged to do more and be more. If you share our passion for excellence and growth, look no further. We have an ambitious mission, and we need a world-class team to make it a reality. Upgrade to a first-class team!
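The data-quality duties listed above (validation, cleansing, deduplication) can be sketched at toy scale. This is a hypothetical Python example, not TripStack's pipeline: it deduplicates event records by key, keeping the most recent version, a common cleansing step when upstream sources emit the same logical record more than once. The field names (`id`, `updated_at`, `status`) are illustrative.

```python
from typing import Iterable


def deduplicate_latest(records: Iterable[dict]) -> list[dict]:
    """Keep one record per 'id', preferring the highest 'updated_at'."""
    latest: dict = {}
    for rec in records:
        key = rec["id"]
        # Replace a stored record only if this one is newer.
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())


events = [
    {"id": "a", "updated_at": 1, "status": "pending"},
    {"id": "a", "updated_at": 3, "status": "confirmed"},
    {"id": "b", "updated_at": 2, "status": "pending"},
]
deduped = deduplicate_latest(events)
```

In a real pipeline the same idea is usually pushed down into the storage layer (e.g., merge/upsert semantics) rather than done in application memory.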
Posted 5 days ago
4.0 years
0 Lacs
India
On-site
Zoho CRM Consultant
Job Summary: We are seeking an experienced Senior Zoho CRM Consultant with 4+ years of hands-on experience in configuring, customizing, and managing Zoho CRM. The ideal candidate will be proficient in workflow automation, custom functions using Deluge, integrations, and analytics. This role requires a strong understanding of business processes and the ability to translate them into scalable CRM solutions that drive business efficiency.
Key Responsibilities:
CRM Configuration & Customization
Customize CRM modules, fields, layouts, and templates to align with business requirements
Configure and optimize sales pipelines
Design and implement workflow automation, including approvals and notifications
Develop custom functions using Deluge scripting
Integrations & APIs
Integrate Zoho CRM with third-party platforms such as Google Workspace, Office 365, and social media tools
Implement seamless integrations within the Zoho ecosystem (Books, Desk, Campaigns, etc.)
Utilize Zoho CRM APIs for custom development and advanced automation
Data Management
Perform data import/export for contacts, leads, and deals
Ensure data quality through cleaning and deduplication practices
User Management & Security
Manage role-based permissions and user access
Configure multi-currency and multi-language support for global usage
Analytics & Reporting
Build and manage custom reports and dashboards to track KPIs and performance metrics
Deliver actionable insights through CRM analytics to support data-driven decision-making
Required Skills & Qualifications:
4+ years of experience working with Zoho CRM
Strong knowledge of Deluge scripting, automation workflows, and custom functions
Proven experience with CRM integrations, APIs, and the broader Zoho suite
Solid understanding of CRM data management and security best practices
Strong analytical, problem-solving, and communication skills
Ability to collaborate with cross-functional teams and work independently
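The "cleaning and deduplication" responsibility above can be illustrated with a small sketch. This is written in Python rather than Deluge, and the record shape is hypothetical, not the Zoho CRM schema: contacts sharing a normalized email address are merged, keeping the first-seen record and back-filling any empty fields from later duplicates.

```python
def normalize_email(email: str) -> str:
    """Lowercase and trim an address so 'Jane@X.com ' and 'jane@x.com' match."""
    return email.strip().lower()


def dedupe_contacts(contacts: list[dict]) -> list[dict]:
    """Merge contacts by normalized email; keep the first record seen
    and fill in fields it is missing from later duplicates."""
    merged: dict[str, dict] = {}
    for contact in contacts:
        key = normalize_email(contact["email"])
        if key not in merged:
            merged[key] = dict(contact)
        else:
            for field, value in contact.items():
                # Only fill fields that are absent or empty in the kept record.
                if not merged[key].get(field):
                    merged[key][field] = value
    return list(merged.values())


contacts = [
    {"email": "Jane@Example.com", "name": "Jane", "phone": ""},
    {"email": "jane@example.com ", "name": "Jane D.", "phone": "555-0100"},
]
clean = dedupe_contacts(contacts)
```

In practice this kind of merge would run against records fetched and written back through the CRM's API, with a review step before destructive merges.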
Posted 5 days ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Global Sales Strategy & Operations (GSSO) is the team that helps shape Gartner's mission-critical sales priorities and works with sales leaders to drive tactical and analytical insights. As an associate on the GSSO team, you'll be at the forefront of the ongoing transformation of Gartner's sales force, which delivers approximately $4.9B in annual revenue and is working to drive sustained double-digit growth. You will partner with business leaders across Gartner to support a global sales force comprised of more than 5,000 associates who sell to every major function, industry and market sector around the world.
About This Role
The Senior Specialist role is part of the Territory Planning & Analytics (TP&A) team in GSSO. The TP&A team is focused on designing the optimal territory investment, design and alignment strategy to maximize Sales (and Services) productivity. The Senior Specialist will join the Territory Contact & Enrichment Team in TP&A, which uses various tools and platforms driven by automation and analytics to empower sellers with high-quality bullseye prospect contacts quickly, proactively, accurately and at scale in a prioritized and streamlined manner.
What You Will Do
Operational Excellence
Actively work on completing the prospect research life cycle within timelines and provide timely and accurate output.
Work on ad-hoc operational projects with minimal guidance.
Manage and resolve assigned tasks end to end within service level agreements.
Drive operational excellence through incremental improvement in processes across prospecting: reduced steps, intuitive and simpler process design/interfaces.
Implement best practices/tested solutions across processes to maximize effectiveness.
Stakeholder Management and Collaboration
Engage with stakeholders and partner with members across functions to deliver value.
Partner within and across teams to identify gaps, problem-solve and improve processes.
Embrace collaboration, improve ideas, and apply analytical thinking to drive impact.
Perform manual workstreams to deliver output, and bring a champion mindset to proposing solutions for automating those manual workstreams.
Own and drive execution of assigned workstreams independently.
Project Management
Ensure data integrity by identifying discrepancies and updating datasets regularly.
Manage data tools and internal applications including automation tools, Excel, Power BI, prospecting tools, etc.
Successfully drive high-complexity projects with minimal guidance.
Gain a strong understanding of internal systems and processes such as Bullseye, title QC, deduplication, etc.
Mentor and train other team members on processes/tools.
What You Will Need
Bachelor's degree with 3 years of relevant experience in a global organization
Proficiency in Microsoft Office, especially Excel and PowerPoint; knowledge of third-party prospecting tools is a plus
Excellent oral and written communication skills
Ability to thrive in a fast-paced, deadline-driven, and dynamic team environment
Experience communicating complex data through relevant means to senior leaders
Experience applying various analytic techniques (segmentation, regression, forecasting, etc.) is a plus
Excellent interpersonal skills, a team player and quick learner
What You Will Get
Competitive salary, generous paid time off policy and more!
India: Group Medical Insurance, Parental Leave, Employee Assistance Program (EAP)
Collaborative, team-oriented culture that embraces diversity
Professional development and unlimited growth opportunities
#GSSO
Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities.
Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.
What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.
What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive, working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.
Job Requisition ID: 100271
By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 5 days ago
4.0 years
0 Lacs
Bangalore Rural, Karnataka, India
On-site
Cohesity is a leader in AI-powered data security and management. Aided by an extensive ecosystem of partners, Cohesity makes it easy to secure, protect, manage, and get value from data — across the data center, edge, and cloud. Cohesity helps organizations defend against cybersecurity threats with comprehensive data security and management capabilities, including immutable backup snapshots, AI-based threat detection, monitoring for malicious behavior, and rapid recovery at scale. We've been named a Leader by multiple analyst firms and have been globally recognized for Innovation, Product Strength, and Simplicity in Design. Join us on our mission to shape the future of our industry. Are you ready to innovate with an industry leader?
We are seeking outstanding Engineers/Leads who bring experience building LARGE distributed systems and solving sophisticated problems. Cohesity Data Platform is a limitless scale-out system. It is the industry's only hyperconverged platform crafted to consolidate ALL secondary storage and data services, built on a web-scale distributed architecture. Cohesity SpanFS was built to consolidate all secondary storage and eliminate legacy storage silos. It's the only file system that combines NFS, SMB and S3 interfaces, global deduplication, and unlimited snaps and clones on a web-scale platform. No more compromising between enterprise and cloud stacks!
There will be a large variety of features to work on, including: hyper-convergence, distributed data path, distributed filesystem, data across thousands of nodes, object storage, cloud services, asynchronous programming, performance optimization, software-defined infrastructure, consensus protocols, massively parallel and distributed data sets, infinite scalability, snapshots, resiliency, deduplication, compression, replication, multiple protocols, fault tolerance, infrastructure and more that we cannot disclose yet.
How You'll Spend Your Time Here
As part of this core development team, you will design and build massively distributed systems at web scale. You will be building the core backend of the Cohesity Data Platform and Cohesity SpanFS (limitless filesystem).
WE'D LOVE TO TALK TO YOU IF YOU HAVE MANY OF THE FOLLOWING:
4+ years of experience in platform development
BE/BTech degree in Computer Science with proficiency in data structures, algorithms, and software design; a Master's degree would be a plus
Experience with, and understanding of, large-scale engineering challenges and highly available distributed systems
Understanding of multithreading, concurrency, and parallel processing
Strong programming and debugging skills in C, C++, Golang or Java
Familiarity with distributed storage, filesystems, or object storage is a huge plus
Strong skills in solving complex problems
Experience with debugging, diagnosing, and fixing complex production software
Ability to work in a fast-paced, agile development environment, drive tasks to completion, and take ownership of projects
Excellent communication and sharp analytical abilities
Data Privacy Notice For Job Candidates
For information on personal data processing, please see our Privacy Policy.
In-Office Expectations
Cohesity employees who are within a reasonable commute (e.g. within a forty-five (45) minute average travel time) work out of our core offices 2-3 days a week of their choosing.
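Global deduplication, one of the SpanFS capabilities mentioned above, can be illustrated at toy scale: split data into blocks, hash each block, and store each unique block only once, keeping a per-file "recipe" of block hashes. This is a simplified, hypothetical sketch, not Cohesity's implementation; production systems use variable-size chunking, distributed indexes, and collision safeguards.

```python
import hashlib

BLOCK_SIZE = 4  # tiny block size for illustration; real systems use KB-scale blocks


class DedupStore:
    """Toy content-addressed store: each unique block is kept exactly once."""

    def __init__(self) -> None:
        self.blocks: dict[str, bytes] = {}  # block hash -> block data

    def write(self, data: bytes) -> list[str]:
        """Split data into fixed-size blocks; return the list of block hashes."""
        recipe = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # deduplicate: store once
            recipe.append(digest)
        return recipe

    def read(self, recipe: list[str]) -> bytes:
        """Reassemble the original data from its block hashes."""
        return b"".join(self.blocks[d] for d in recipe)


store = DedupStore()
recipe_a = store.write(b"ABCDABCDXYZ!")  # repeated 'ABCD' block stored once
recipe_b = store.write(b"ABCDXYZ!")      # shares both blocks with the first write
```

The payoff is visible even here: two writes totaling 20 bytes need only two unique blocks in the store, which is why deduplication is so effective for backup workloads full of near-identical data.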
Posted 1 week ago