
2588 Vault Jobs - Page 45

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Description
We are looking for a Lead Full Stack Developer QA Manager with strong expertise in people management, leadership, software architecture, cybersecurity, and quality assurance. This role is ideal for someone with 10-14 years of experience building, testing, securing, and optimizing scalable web applications, distributed systems, and microservices while leading and mentoring development and QA teams. If you have a passion for fostering high-performance teams, ensuring technical excellence, and driving software quality and security at scale, we would love to hear from you!

Key Responsibilities

Leadership & People Management
• Lead, mentor, and manage a team of full-stack developers and QA engineers, fostering a high-performance and collaborative culture.
• Develop team members through coaching, technical guidance, and performance feedback.
• Establish and enforce coding, security, and QA best practices within the engineering organization.
• Drive continuous improvement in development and testing processes through automation, tooling, and process enhancements.
• Collaborate with cross-functional teams, including Product Management, DevOps, Security, and developers, to align engineering efforts with business goals.
• Define career growth plans, performance goals, and training initiatives for engineers and QA professionals.

Software Development & System Architecture
• Architect, develop, and maintain high-performance, scalable, and secure web applications using Python and JavaScript.
• Lead the design and implementation of robust microservices architectures using GoF design patterns.
• Build and optimize APIs and backend logic using frameworks like Django, Flask, or FastAPI, ensuring adherence to OWASP security best practices.
• Implement secure authentication and authorization using OAuth, JWT, OpenID Connect, and RBAC.
• Design and manage event-driven architectures leveraging Apache Kafka for real-time, asynchronous processing.
• Optimize data pipelines with Apache Spark for large-scale processing and Apache Iceberg for efficient data lake management.
• Drive secure coding practices by proactively mitigating risks such as SQL Injection, XSS, CSRF, and Insecure Deserialization.
• Conduct technical reviews, troubleshoot complex system issues, and ensure scalability, reliability, and security.

Quality Assurance & Automated Testing
• Define and own the test strategy across all development phases, including unit, integration, API, performance, and end-to-end (E2E) testing.
• Lead the implementation of automated test frameworks such as PyTest, Jest, Mocha, Cypress, Playwright, or Selenium.
• Implement TDD and BDD methodologies, ensuring security-focused software testing.
• Oversee API testing strategies with tools like Postman, Newman, or Karate.
• Conduct performance and load testing using tools like Locust, JMeter, or k6, ensuring applications meet scalability demands.
• Manage security vulnerability testing using OWASP ZAP, Burp Suite, or SonarQube to detect and mitigate risks.
• Drive SAST and DAST integration into CI/CD pipelines.
• Establish and enforce QA processes, including bug tracking, defect analysis, and root cause investigations.

CI/CD & DevOps Integration
• Lead the integration of automated testing and security validation within CI/CD pipelines (Jenkins, GitHub Actions, GitLab CI/CD).
• Define shift-left testing strategies, catching defects and vulnerabilities early in development.
• Monitor and analyze application quality, security, and performance.
• Oversee containerization strategies using Docker and Kubernetes, and implement container security best practices.
• Implement and manage secrets management solutions (HashiCorp Vault, AWS Secrets Manager).
• Drive Infrastructure as Code (IaC) adoption to automate secure and scalable deployments.

Collaboration & Cross-Functional Leadership
• Work closely with stakeholders to translate business requirements into scalable and secure technical solutions.
• Lead cross-team discussions to align development, QA, security, and operational goals.
• Promote a security-first, high-quality engineering culture across the organization.
• Provide technical thought leadership by contributing to architectural decisions, design patterns, and process improvements.

Requirements

Leadership & People Management
• Proven experience managing and mentoring development and QA teams.
• Strong ability to coach, develop, and retain top engineering talent.
• Experience establishing and enforcing engineering and QA best practices.
• Ability to communicate complex technical concepts to executive stakeholders and cross-functional teams.

Full-Stack Development & Distributed Systems
• Expertise in building secure, responsive front-end applications using React, Angular, or Vue.js.
• Strong proficiency in Python and JavaScript, particularly with Django, Flask, or FastAPI.
• Deep understanding of distributed systems, microservices, design patterns, and event-driven architectures (Kafka).
• Strong expertise in secure API development, authentication, and authorization.
• Experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
• Expertise in secure coding principles and mitigation of OWASP Top 10 vulnerabilities.

Big Data & Event-Driven Architecture
• Experience with Apache Kafka for real-time event-driven architectures.
• Proficiency in Apache Spark for distributed data processing and analytics.
• Understanding of Apache Iceberg for schema evolution and transactional data lakes.

Quality Assurance & Test Automation
• Deep knowledge of automated testing frameworks such as PyTest, Jest, Mocha, Cypress, Selenium, and Playwright.
• Expertise in TDD and BDD methodologies.
• Hands-on experience with security testing tools (OWASP ZAP, Burp Suite, SonarQube).
• Experience integrating security testing into CI/CD pipelines.
• Performance testing expertise with Locust, JMeter, or k6.

CI/CD, DevOps & Cloud
• Experience managing CI/CD pipelines (Jenkins, GitHub Actions, GitLab CI/CD).
• Knowledge of container orchestration with Docker and Kubernetes.
• Hands-on experience with AWS, Azure, or GCP cloud platforms.
• Strong understanding of secrets management, security scanning, and compliance automation.

Code Quality & Best Practices
• Proven ability to enforce coding standards, security best practices, and robust test automation.
• Strong experience in ensuring continuous testing and deployment readiness.

Qualifications
• Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent experience.
• Relevant certifications (AWS, Azure, Kubernetes, CISSP, CISM, or Certified Ethical Hacker) are a plus.

Job: Engineering | Primary Location: India-Maharashtra-Mumbai | Other Locations: India-Maharashtra-Mumbai | Schedule: Full-time | Travel: No | Req ID: 250653 | Job Hire Type: Experienced
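
The posting above pairs secure API development (FastAPI, JWT, RBAC) with test automation (PyTest). As a hedged illustration only, not taken from the employer, here is a minimal sketch of a JWT-protected FastAPI endpoint; the secret, claims, and route are hypothetical placeholders.

```python
# Minimal sketch: JWT-protected FastAPI endpoint (all names hypothetical).
# Requires: pip install fastapi pyjwt
import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

SECRET_KEY = "change-me"  # in practice, loaded from a secrets manager, never hardcoded
app = FastAPI()
bearer = HTTPBearer()

def current_user(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> dict:
    """Validate the bearer token and return its claims, or reject with 401."""
    try:
        return jwt.decode(creds.credentials, SECRET_KEY, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")

@app.get("/orders")
def list_orders(user: dict = Depends(current_user)) -> dict:
    # An RBAC check on user["role"] would go here.
    return {"user": user.get("sub"), "orders": []}
```

A PyTest check could then call this route with fastapi.testclient.TestClient and assert that a request without a token returns 401, which is the kind of unit/API coverage the posting's test strategy describes.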

Posted 1 month ago

Apply

3.0 years

4 - 5 Lacs

Hyderābād

On-site

Job Description: Manager, Product Analyst – Quality

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location: investing in the growth, success, and well-being of our people; making sure colleagues from each IT division feel a sense of belonging; and managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview:
As a Sr. Specialist, Product Analyst – Quality, you will be responsible for driving solution design, implementation, and continuous improvement of the different QMS tools, with specific alignment to the Quality business processes. Essential skills include a strong technical as well as business background, proficiency in project management methodologies (Agile, Scrum), and excellent organizational abilities. This role is positioned within the Quality Value Team; the role holder will have advanced experience in the life sciences industry, specifically in Quality Management Systems and technology landscapes (in particular, Veeva Vault Quality), will have knowledge of GxP, and will play a critical role during solution design to satisfy business needs while ensuring adaptability to future system scalability.

What will you do in this role:
• Apply a structured approach to discover, document, and manage business processes and user and stakeholder needs, including opportunity statements, use cases, insights, and requirements.
• Gather insight into user journeys, behavior, motivation, and pain points. Expose unarticulated problems and unmet needs.
• Document business processes and business and user needs in the form of problem statements to make up the backlog. Facilitate the "how" with the development team. Gain expertise in the business area.
• Manage business analysis per agreed priority backlog items in JIRA.
• Participate in impact assessment activities, reviewing proposed changes and ensuring impact is understood.
• Deliver product enhancements through the agreed backlog process to ensure Quality solutions evolve to meet business needs.
• Ensure Quality solutions remain compliant as a Validated Solution through verification testing, documentation, and validation efforts.
• Provide overall leadership, guidance, and management of all aspects of a given solution, including requirements gathering, the enhancements delivery plan, and implementation.
• Initiate projects, including defining a scope/charter, identifying stakeholders, and establishing governance.
• Act as a bridge between business SMEs, technical teams, and non-technical stakeholders. Communicate delivery status, solution health, risks, and issues to all parties involved and ensure that everyone is aligned and informed.
• Conduct product status meetings and present updates to stakeholders and senior management.
• Evaluate delivery performance and implement continuous improvement practices.
• Understand the technical aspects as well as business process impacts to make informed decisions, provide guidance, and communicate effectively with the development team. This includes a deep understanding of the QMS business processes, technology stack, architecture, and potential technical challenges.
• Work closely with the Product Owner to prioritize and refine the product backlog, ensuring that the team focuses on delivering the most valuable features.
• Identify potential risks and develop mitigation strategies. Proactively address issues that could impact project success.

What should you have:
• Minimum level of education required: Bachelor's degree in Computer Science, Engineering, MIS, Science, or a related field. The job requires a solid academic background in how information technology supports the delivery of business objectives.
• Preferred: Veeva certifications (Veeva Vault / Vault Quality Suite / QMS); the role holder has completed the Certified Vault Training and is up to date.
• 3+ years of experience in technical project management, with a strong understanding of project management methodologies (Agile, Scrum, Waterfall).
• Understanding of Quality Management System capabilities (audit/inspection management, CAPA management, deviations management, complaint management).
• Experience in solution delivery with GMP systems.
• Experience with architecture, integration, interfaces, portals, and/or analytics.
• Understanding of the Systems Development Life Cycle (SDLC) and current Good Manufacturing Practice (cGMP) processes.
• Knowledge of and experience with QMS-relevant tools like Veeva Vault Quality and TrackWise.
• Proven experience leading complex technical projects in a fast-paced environment.
• Strong technical background with knowledge of software development, systems integration, or related areas.
• Excellent organizational, leadership, and decision-making skills.
• Strong analytical and problem-solving abilities.
• Effective communication and interpersonal skills to liaise with cross-functional teams.
• Ability to manage multiple projects simultaneously and adapt to changing priorities.

Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are intellectually curious, join us and start making your impact today.

Search Firm Representatives, Please Read Carefully:
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Asset Management, Benefits Management, Business, Management Process, Management System Development, Product Lifecycle, Product Management, Quality Management, Requirements Management, Social Collaboration, Stakeholder Relationship Management, Strategic Planning, System Designs
Preferred Skills:
Job Posting End Date: 07/24/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date; please apply no later than the day before the end date.
Requisition ID: R350703

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurgaon

On-site

Project Role: Cloud Migration Engineer
Project Role Description: Provides assessment of existing solutions and infrastructure to migrate to the cloud. Plan, deliver, and implement application and data migration with scalable, high-performance solutions using private and public cloud technologies, driving next-generation business outcomes.
Must-have skills: Cloud Migration Planning
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Cloud Migration Engineer, you will assess existing solutions and infrastructure to migrate to the cloud. You will plan, deliver, and implement application and data migration with scalable, high-performance solutions using private and public cloud technologies, driving next-generation business outcomes.

Roles & Responsibilities:
• Hands-on deployment, configuration, and maintenance of Azure resources across multiple subscriptions and environments.
• Actively monitor and troubleshoot Azure infrastructure using Azure Monitor, Log Analytics, and Application Insights to ensure optimal performance and uptime.
• Implement and manage Azure Virtual Machine Scale Sets (VMSS) to support scalable and highly available applications.
• Design and configure high-availability solutions using Availability Zones and Availability Sets based on workload requirements.
• Manage secure storage of secrets, certificates, and credentials with Azure Key Vault, including access policies and integration with Azure services.
• Set up and manage Azure Site Recovery (ASR) for business continuity and disaster recovery scenarios, ensuring quick and reliable failover/failback operations.
• Hands-on configuration and troubleshooting of Azure networking components, such as:
  o Virtual Networks (VNets)
  o Subnets and IP addressing
  o Network Security Groups (NSGs)
  o Azure Load Balancers (Standard and Basic)
  o Azure VPN Gateway, ExpressRoute, and Application Gateway
• Automate infrastructure tasks using PowerShell, Azure CLI, and ARM templates to streamline deployment and management.
• Collaborate with development, DevOps, and security teams to integrate Azure services and maintain secure, compliant cloud environments.
• Perform regular health checks and optimization of cloud resources, cost analysis, and performance tuning.
• Hands-on management of backups, snapshots, and Azure Recovery Services Vault to protect critical workloads and ensure data integrity.

Professional & Technical Skills:
- Must-have: Proficiency in Cloud Migration Planning.
- Strong understanding of cloud computing concepts.
- Experience with cloud migration tools and technologies.
- Knowledge of security and compliance requirements in cloud environments.
- Hands-on experience executing cloud migration projects.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Cloud Migration Planning.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
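
The responsibilities above include automating Azure resource tasks; the posting names PowerShell, Azure CLI, and ARM templates as the tools. As a hedged sketch only, the snippet below does the equivalent with the Azure SDK for Python: it lists the VMs in one resource group. The subscription ID and resource group are placeholders.

```python
# Hedged sketch: enumerate VMs in one resource group with the Azure SDK for Python.
# Assumes: pip install azure-identity azure-mgmt-compute, and that the caller is
# already authenticated (env vars, managed identity, or `az login`).
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-example"                             # placeholder

def list_vm_names() -> list[str]:
    """Return the names of all VMs in the resource group."""
    client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    return [vm.name for vm in client.virtual_machines.list(RESOURCE_GROUP)]

if __name__ == "__main__":
    for name in list_vm_names():
        print(name)
```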

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida

On-site

Project Role: Cloud Migration Engineer
Project Role Description: Provides assessment of existing solutions and infrastructure to migrate to the cloud. Plan, deliver, and implement application and data migration with scalable, high-performance solutions using private and public cloud technologies, driving next-generation business outcomes.
Must-have skills: Cloud Migration Planning
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Cloud Migration Engineer, you will assess existing solutions and infrastructure to migrate to the cloud. You will plan, deliver, and implement application and data migration with scalable, high-performance solutions using private and public cloud technologies, driving next-generation business outcomes.

Roles & Responsibilities:
• Hands-on deployment, configuration, and maintenance of Azure resources across multiple subscriptions and environments.
• Actively monitor and troubleshoot Azure infrastructure using Azure Monitor, Log Analytics, and Application Insights to ensure optimal performance and uptime.
• Implement and manage Azure Virtual Machine Scale Sets (VMSS) to support scalable and highly available applications.
• Design and configure high-availability solutions using Availability Zones and Availability Sets based on workload requirements.
• Manage secure storage of secrets, certificates, and credentials with Azure Key Vault, including access policies and integration with Azure services.
• Set up and manage Azure Site Recovery (ASR) for business continuity and disaster recovery scenarios, ensuring quick and reliable failover/failback operations.
• Hands-on configuration and troubleshooting of Azure networking components, such as:
  o Virtual Networks (VNets)
  o Subnets and IP addressing
  o Network Security Groups (NSGs)
  o Azure Load Balancers (Standard and Basic)
  o Azure VPN Gateway, ExpressRoute, and Application Gateway
• Automate infrastructure tasks using PowerShell, Azure CLI, and ARM templates to streamline deployment and management.
• Collaborate with development, DevOps, and security teams to integrate Azure services and maintain secure, compliant cloud environments.
• Perform regular health checks and optimization of cloud resources, cost analysis, and performance tuning.
• Hands-on management of backups, snapshots, and Azure Recovery Services Vault to protect critical workloads and ensure data integrity.

Professional & Technical Skills:
- Must-have: Proficiency in Cloud Migration Planning.
- Strong understanding of cloud computing concepts.
- Experience with cloud migration tools and technologies.
- Knowledge of security and compliance requirements in cloud environments.
- Hands-on experience executing cloud migration projects.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Cloud Migration Planning.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

10.0 years

0 Lacs

Andhra Pradesh, India

On-site

10 years of hands-on experience in Java full stack (Java Spring Boot): Java 11, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, Microservices.

Responsibilities:
• Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot.
• Collaborate with cross-functional teams to define, design, and ship new features.
• Work on the entire software development lifecycle, from concept and design to testing and deployment.
• Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability.
• Integrate microservices with Kafka for real-time data streaming and event-driven architecture.
• Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
• Keep up to date with industry trends and advancements, incorporating best practices into our development processes.

Requirements:
• Must be a Java full-stack developer.
• Bachelor's or Master's degree in Computer Science or a related field.
• Proficiency in Spring Boot and other Spring Framework components.
• Extensive experience designing and developing RESTful APIs.
• Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
• Experience with Kafka for building event-driven architectures.
• Strong database skills, including SQL and NoSQL databases.
• Familiarity with containerization and orchestration tools (Docker, Kubernetes).
• Excellent problem-solving and troubleshooting skills.
• Strong communication and collaboration skills.
• Good to have: TM Vault core banking knowledge.
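
Several postings on this page, including the one above, ask for Kafka-based event-driven integration between microservices. Below is a hedged, minimal sketch of the publish side of that pattern. The posting's stack is Java/Spring; Python (kafka-python) is used here only to keep this page's examples in one language, and the topic and event fields are hypothetical.

```python
# Hedged sketch: publish a domain event to Kafka with kafka-python.
# Assumes: pip install kafka-python, and a broker reachable at localhost:9092.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # JSON-encode events
)

# Hypothetical "order created" event; downstream consumers react asynchronously.
producer.send("orders.created", {"order_id": "A-1001", "amount": 499.0})
producer.flush()  # block until the event is handed to the broker
```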

Posted 1 month ago

Apply

9.0 - 16.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About REA Group:
In 1995, in a garage in Melbourne, Australia, REA Group was born from a simple question: "Can we change the way the world experiences property?" Could we? Yes. Are we done? Never. Fast forward 30 years, and REA Group is a market leader in online real estate on three continents, continuing to grow rapidly across the globe. The secret to our growth is staying true to that 'day one' mindset: the hunger to innovate, the ambition to change the world, and the curiosity to reimagine the future. Our new Tech Center in Cyber City is dedicated to accelerating REA Group's global technology delivery through relentless innovation. We're looking for the best technologists, inventors, and leaders in India to join us on this exciting new journey. If you're excited by the prospect of creating something magical from scratch, then read on.

What the role is all about:
We are looking for an Engineering Manager with 9-16 years of experience to lead the Audience Data team within the Personalization & Privacy division of our Consumer Group. This team is tasked with managing a significant data asset comprising terabytes of user interactions from our website and apps. This data is crucial for our personalization efforts, machine learning, product insights and analytics, and customer reporting functions. You will collaborate closely with our AI team, providing rich datasets essential for developing predictive models and personalized recommenders to enhance the accuracy and effectiveness of our machine learning solutions. Additionally, these comprehensive datasets facilitate in-depth analysis and understanding of user behaviour, supporting data-driven decision-making across our business.

While no two days are likely to be the same, your typical responsibilities will include:
• End-to-end technical delivery of complex initiatives under our product management pipeline, using agile methods and frameworks and working with cross-disciplinary teams.
• Provide technical leadership and guidance to the team, serving as a subject matter expert in data engineering and related technologies.
• Contribute to the design and architecture of scalable, efficient, and secure data solutions, considering long-term scalability and maintainability.
• Provide guidance, support, and leadership to the team. Establish effective ways of working within and across teams.
• Embrace continuous learning, leveraging the latest development trends to solve complex challenges.
• Drive delivery practices with the Delivery Lead, running agile scrum ceremonies and producing agile artefacts.
• Contribute to the adoption of best practices, coding standards, and engineering principles across the team to ensure a high-quality and maintainable codebase.
• Collaborate with the development team to implement shift-left testing practices, ensuring early and frequent testing throughout the development lifecycle.
• Conduct performance analysis, optimization, and tuning of data processing workflows and systems to enhance efficiency and meet performance targets.
• Support the team's iterations, scope, capacity, risks, issues, and timelines.
• Participate in technical discussions, code reviews, and architectural reviews to maintain code quality, identify improvement opportunities, and ensure adherence to standards.
• Mentor and coach engineers, fostering their professional growth, assisting them in overcoming technical challenges, and creating a culture of quality and efficiency, leading to reduced time-to-market and enhanced product quality.
• Collaborate with stakeholders to define data governance policies, data quality standards, and data management processes.
• Drive continuous improvement initiatives, such as automation, tooling enhancements, and process optimizations, to increase productivity and operational efficiency.
• Act as a strong advocate for data-driven decision-making, promoting a data-driven culture within the organization.

Who we're looking for:
• 9-16 years of experience working with platform and data engineering environments.
• Proven people leadership and mentoring experience.
• Extensive experience in designing, coding, and testing data platform / management tools and systems.
• Excellent knowledge of software development principles and best practices.
• Proficiency in programming languages commonly used in platform and data engineering, such as Python, Java, or Go.
• Strong skills in analytical SQL.
• Experience with data engineering and associated technologies such as dbt, Airflow, BigQuery / Snowflake, data lakes, and Hive for ETL/ELT.
• Experience with data modelling methodologies like Kimball or Data Vault 2.0 preferred.
• Experience with Data Observability (data quality monitoring) preferred.
• Exposure to, or knowledge of, Kafka, Google Pub/Sub, Apache Flink (or Spark), and streaming SQL is preferred.
• Exposure to Linux and shell scripting.
• Experience with DevOps practices and techniques, such as Docker and CI/CD tools.
• Exposure to data management practices (data catalogues, data security).
• Excellent communication skills and the ability to collaborate effectively with business stakeholders and cross-functional teams.
• Ability to manage the competing demands of multiple projects in a timely manner.
• Ability to effectively communicate complex solutions to audiences with varying technical backgrounds, fostering consensus and collaboration.
• Ability to work collaboratively and autonomously in a fast-paced environment.
• Willingness to learn new and complex technologies, and ability to share knowledge with the team.

Bonus points for:
• Experience using and managing cloud infrastructure in AWS and/or GCP.
• Experience with Infrastructure as Code techniques, particularly Terraform.
• Exposure to platform engineering concepts or developer experience & tooling.

What we offer:
• A hybrid and flexible approach to working.
• Transport options to help you get to and from work, including home pick-up and drop-off.
• Meals provided on site in our office.
• Flexible leave options, including parental leave, family care leave, and celebration leave.
• Insurance for you and your immediate family members.
• Programs to support mental, emotional, financial, and physical health and wellbeing.
• Continuous learning and development opportunities to further your technical expertise.

The values we live by:
Our values are at the core of how we operate, treat each other, and make decisions. We believe that how we work is equally as important as what we do to achieve our goals. This commitment is at the heart of everything we do, from the way we interact with colleagues to the way we serve our customers and communities.

Our commitment to Diversity, Equity, and Inclusion:
We are committed to providing a working environment that embraces and values diversity, equity, and inclusion. We believe teams with diverse ideas and experiences are more creative, more effective, and fuel disruptive thinking, whether that diversity comes from cultural and ethnic backgrounds, gender identity, disability, age, sexual orientation, or any other identity or lived experience. We know diverse teams are critical to maintaining our success and driving new business opportunities. If you've got the skills, dedication, and enthusiasm to learn but don't necessarily meet every single point on the job description, please still get in touch.

REA Group in India:
You might already recognize our logo. The REA brand does have an existing presence in India; in fact, we set up our new tech hub in Gurugram to be their neighbours! REA Group holds a controlling interest in REA India Pte. Ltd., operator of established brands Housing.com, Makaan.com, and PropTiger.com, three of the country's leading digital property marketplaces. Through our close connection to REA India, we've seen first-hand the incredible talent the country has to offer, and the huge opportunity to expand our global workforce. Our Cyber City Tech Center is an extension of REA Group: a satellite office working directly with our Australia HQ on local projects and tech delivery. All our brands, across the globe, connect regularly, learn from each other, and collaborate on shared value initiatives.
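
The role above references orchestration with Airflow alongside dbt and BigQuery/Snowflake. As a hedged illustration only, here is a minimal Airflow DAG skeleton for a daily audience-data load; the DAG id, task names, and load logic are hypothetical.

```python
# Hedged sketch: a minimal daily Airflow DAG (ids, tasks, and logic hypothetical).
# Assumes Airflow 2.4+ (the `schedule` parameter): pip install apache-airflow
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**context) -> None:
    # Placeholder: pull raw interaction events for the run date.
    print("extracting events for", context["ds"])

def load_warehouse(**context) -> None:
    # Placeholder: load transformed events into the warehouse (e.g., BigQuery).
    print("loading warehouse for", context["ds"])

with DAG(
    dag_id="audience_data_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load  # run the extract first, then the warehouse load
```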

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

🌩️ Senior / Lead AWS Cloud Engineer
📍 Location: Kochi, Pune, and Chennai
🕒 Experience: 8+ Years
💼 Type: Full-time

We are looking for a Senior or Lead AWS Cloud Engineer with a strong foundation in software development, cloud architecture, and DevOps practices. This is a hands-on technical leadership role, ideal for someone who thrives in building modern, scalable cloud-native systems.

🔧 Your Role and Responsibilities
• Design and develop scalable, secure, and reliable cloud-native applications on AWS
• Lead implementation of containerized environments using Kubernetes (EKS/OpenShift)
• Automate infrastructure using Terraform or AWS CDK
• Build and maintain CI/CD pipelines with GitLab CI, GitHub Actions, Jenkins, or ArgoCD
• Collaborate with cross-functional teams to ensure production-ready, high-quality solutions
• Mentor junior engineers and conduct code/architecture reviews
• Optimize performance and ensure observability using tools like Datadog

📚 Qualifications
• Bachelor's/Master's degree in Computer Science, Engineering, or a related field
• AWS or other cloud certifications (e.g., Solutions Architect, DevOps Engineer) are a plus
• Excellent communication and leadership skills

✅ Must-Have Skills (5+ Years)
• Proficiency in modern programming languages: TypeScript, Python, Go
• Strong experience with AWS serverless (e.g., AWS Lambda)
• Deep understanding of AWS-managed databases: RDS, DynamoDB
• Hands-on with Kubernetes, AWS EKS/ECS, or OpenShift
• Proven CI/CD experience with tools like GitLab CI, GitHub, Jenkins, ArgoCD
• Familiarity with Git, Jira, Confluence

💡 Should-Have Skills (3+ Years)
• AWS networking: API Gateway
• AWS storage services: S3
• Exposure to AWS AI/ML (e.g., SageMaker, Bedrock, Amazon Q)
• IaC tools: Terraform, AWS CDK

🌟 Nice-to-Have Skills (1+ Year)
• Experience with Amazon Connect or other contact-center solutions
• Use of HashiCorp Vault for secrets management
• Knowledge of Kafka, Amazon MSK
• Familiarity with multi-cloud (Azure, GCP)
• Experience with monitoring tools: Datadog, Dynatrace, New Relic
• Understanding of FinOps and cloud cost optimization
• Knowledge of SSO technologies: OAuth2, OpenID Connect, JWT, SAML
• Working knowledge of Linux and shell scripting

If you're passionate about modern cloud infrastructure, building developer-friendly systems, and leading engineering excellence, we'd love to hear from you! Apply with your resume to m.neethu@ssconsult.in
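
The must-have list above pairs AWS serverless (Lambda) with managed databases such as DynamoDB. A hedged, minimal sketch of that combination follows; the table name and event shape are hypothetical.

```python
# Hedged sketch: an AWS Lambda handler writing to DynamoDB via boto3.
# Assumes the function's IAM role allows dynamodb:PutItem on the placeholder table.
import json

import boto3

TABLE_NAME = "orders"  # placeholder table name
dynamodb = boto3.resource("dynamodb")  # created once per container, reused across invocations
table = dynamodb.Table(TABLE_NAME)

def handler(event, context):
    """Persist an order record from the (hypothetical) API Gateway event payload."""
    body = json.loads(event.get("body", "{}"))
    table.put_item(Item={"order_id": body["order_id"], "amount": str(body["amount"])})
    return {"statusCode": 201, "body": json.dumps({"stored": body["order_id"]})}
```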

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About the Job:
We are seeking a skilled and detail-oriented Database Activity Monitoring (DAM) Specialist to join our cybersecurity team. The ideal candidate will have hands-on experience implementing and managing DAM solutions (such as IBM Guardium, Imperva SecureSphere, or Oracle Audit Vault) and ensuring continuous visibility into database activity for threat detection, compliance, and incident response.

Key Responsibilities:
• Implement, configure, and maintain DAM solutions across critical database environments (Oracle, SQL Server, MySQL, etc.).
• Monitor and analyze database activities to detect unauthorized access, policy violations, and anomalous behavior.
• Define and fine-tune DAM policies, rules, alerts, and use cases.
• Perform regular audits and generate reports for compliance (e.g., PCI-DSS, GDPR, HIPAA, SOX).
• Work with database administrators, IT, and security teams to triage and respond to incidents.
• Ensure appropriate integration of DAM with SIEM tools and security workflows.
• Maintain documentation for processes, configurations, and incident handling.
• Support risk assessments and internal/external audits related to database security.
• Stay up to date with database threats, vulnerabilities, and security best practices.

Required Skills and Qualifications:
• Bachelor's degree in Computer Science, Information Security, or a related field.
• 5+ years of hands-on experience with DAM tools (e.g., IBM Guardium, Imperva, Oracle AVDF).
• Strong understanding of database platforms: Oracle, MS SQL, MySQL, PostgreSQL.
• Familiarity with SIEM tools (Splunk, QRadar, etc.) and their integration with DAM solutions.
• Experience in writing and tuning DAM policies and alerts.
• Good knowledge of compliance regulations and audit requirements.
• Strong analytical and troubleshooting skills.
• Excellent written and verbal communication skills.

Why Join Us?
• Work on mission-critical security infrastructure.
• Opportunity to shape enterprise-wide database security.
• Join a passionate team of cybersecurity professionals driving innovation.
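
One recurring task above is integrating DAM alerts with a SIEM. As a hedged, tool-agnostic sketch (real deployments would use the DAM product's native connectors), the snippet below forwards an alert to a SIEM's syslog listener using only the Python standard library; the host, port, and alert fields are hypothetical.

```python
# Hedged sketch: forward a DAM alert to a SIEM syslog listener (stdlib only).
import json
import logging
import logging.handlers

SIEM_HOST, SIEM_PORT = "siem.example.internal", 514  # placeholder endpoint

logger = logging.getLogger("dam.alerts")
logger.setLevel(logging.WARNING)
logger.addHandler(logging.handlers.SysLogHandler(address=(SIEM_HOST, SIEM_PORT)))

def forward_alert(alert: dict) -> None:
    """Serialize the alert and emit it as a syslog warning for SIEM ingestion."""
    logger.warning(json.dumps(alert))

# Hypothetical policy-violation event detected by the DAM platform.
forward_alert({
    "rule": "after-hours-select-on-pii",
    "db_user": "app_readonly",
    "database": "customers",
    "severity": "high",
})
```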

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

About AuthKeeper
AuthKeeper is a zero-knowledge authentication vault designed for modern security and privacy. We offer encrypted storage for TOTP secrets, passwords, secure notes, and credit card data, powered by client-side encryption, real-time sync via Supabase, and robust row-level security. Our mission is to create a product where data sovereignty and usability coexist. Whether you're a developer, privacy advocate, or security-conscious individual, AuthKeeper delivers military-grade protection with zero-trust architecture, ensuring your data remains private, even from us.

Role Overview
We're hiring a Full-Stack Developer with strong experience in React, Supabase, and security-aware frontend/backend development. You'll play a central role in maintaining and scaling our secure vault infrastructure, building user-centric features, and strengthening client-side cryptography and secure storage workflows. This is a hands-on role with high-impact responsibilities and direct influence over a security-first product.

Responsibilities
• Design and develop secure features across the full stack (e.g., vault UI, TOTP, secure notes, password manager)
• Write scalable, privacy-preserving code using React, TailwindCSS, Supabase, and Netlify Functions
• Implement cryptographic workflows using the Web Crypto API and AES-256-GCM
• Enforce strict Row Level Security in Supabase and manage PostgreSQL access policies
• Integrate secure session handling and auto-lock mechanisms for sensitive vault data
• Harden frontend components with strong CSP headers, input validation, and memory-safe design
• Collaborate with security engineers to address threat models and implement mitigation strategies
• Continuously audit and improve encryption practices to maintain zero-knowledge guarantees
• Contribute to a secure CI/CD pipeline with static analysis, secrets detection, and code linting

Required Skills
• Strong hands-on experience with React, TypeScript/JavaScript, and Tailwind CSS
• Deep understanding of Supabase, particularly authentication, RLS, and real-time sync
• Familiarity with Netlify Functions or similar serverless environments
• Experience with client-side encryption, browser-based crypto (Web Crypto API), and secure session design
• Solid knowledge of zero-knowledge architecture, memory handling, and local key derivation (PBKDF2)
• Understanding of web security principles: XSS, CSRF, CSP, HTTPS, HSTS
• Git, CI/CD workflows, and clean modular architecture
• Proactive mindset with attention to security implications in every layer

Nice to Have
• Experience building or contributing to password managers, encrypted storage apps, or MFA tools
• Familiarity with OAuth2, TOTP generation, or browser-extension security models
• Experience implementing Progressive Web Apps (PWAs) or offline-first apps
• Understanding of SSR (e.g., Next.js), advanced security headers, and anti-fingerprinting techniques

Why Join AuthKeeper?
• Help build a product that prioritizes privacy, encryption, and user control
• Work independently with high ownership over core systems
• Collaborate with a mission-driven team on a modern stack
• Gain exposure to advanced cryptography, privacy tech, and real-world threat modeling
• Make an impact in a space where security is not an afterthought; it's the foundation

How to Apply
Send your GitHub, portfolio (or projects), and a short paragraph about why this mission excites you to: 📧 developers@authkeeper.dev
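
The role above touches TOTP generation and local key derivation. As a hedged illustration (AuthKeeper's actual implementation runs in the browser via the Web Crypto API; this Python stdlib version only shows the RFC 6238 mechanics), here is a minimal TOTP generator; the demo secret is hypothetical.

```python
# Hedged sketch: RFC 6238 TOTP using only the Python standard library.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Compute the current TOTP code for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period                 # moving factor (RFC 6238)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical demo secret
```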

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description
AuthKeeper is a zero-knowledge authentication vault that combines military-grade encryption with a user-friendly interface. Designed with privacy and security at its core, AuthKeeper offers encrypted storage for TOTP secrets, passwords, credit cards, and secure notes, all protected by client-side encryption and a zero-trust architecture. Built on Supabase with real-time sync and row-level security, AuthKeeper ensures your data remains private, even from us.

Role Overview
We're seeking a detail-oriented and performance-driven Advertising Specialist to join our team on a temporary, hybrid basis. In this role, you'll be responsible for designing, executing, and optimizing advertising campaigns across digital channels. You'll collaborate closely with our marketing and sales teams to ensure consistent brand messaging, measure campaign performance, and maximize ROI.

Key Responsibilities
• Plan, launch, and manage digital advertising campaigns (e.g., Google Ads, Meta, LinkedIn)
• Analyze campaign data and performance metrics to optimize strategies
• Coordinate with marketing and sales teams to align messaging and target audience
• Monitor budgets, track spend, and report on ROI
• A/B test creatives, landing pages, and targeting approaches
• Stay up to date with advertising trends, tools, and industry best practices

Qualifications
• Strong analytical and data interpretation skills
• Excellent written and verbal communication
• Proven experience in advertising, marketing, or sales
• Hands-on proficiency with digital ad platforms (Google Ads, Meta Ads, LinkedIn Campaign Manager, etc.)
• Ability to develop, test, and optimize ad strategies effectively
• Bachelor's degree in Marketing, Business, Communications, or a related field
• Experience in a tech or cybersecurity company is a plus

Ready to drive high-impact advertising for a privacy-first startup? Apply today and help us grow the AuthKeeper brand.

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Data Engineer
Location: Hyderabad

Role Overview:
The Azure Big Data Engineer is a hands-on technical role focused on designing, implementing, and maintaining robust, scalable data pipelines and storage solutions on the Azure platform. The engineer plays a critical role in enabling data-driven decision-making across the organization by building and managing high-performance data infrastructure.

Key Responsibilities:

Data Engineering & Pipeline Development
• Develop, test, and maintain ETL/ELT pipelines using tools such as Azure Data Factory, Azure Databricks, Synapse Analytics, and Microsoft Fabric.
• Ingest, wrangle, transform, and join data from various sources, ensuring data quality and consistency.
• Implement data storage solutions using Azure Data Lake Storage, Lakehouse, Delta Lake, and data warehousing technologies (e.g., Synapse, Azure SQL Database).

Performance Optimization & Monitoring
• Optimize data pipelines for cost, performance, scalability, and reliability.
• Monitor data flows and proactively troubleshoot pipeline and performance issues.
• Apply best practices for performance tuning and infrastructure alignment.

Security & Compliance
• Implement data security measures, including RBAC, data encryption, and compliance with organizational policies and regulatory standards.
• Address data governance requirements and audit readiness.

Collaboration & Stakeholder Communication
• Work closely with data scientists, architects, and analysts to gather requirements and deliver solutions aligned with business needs.
• Translate business requirements into technical designs and documentation.
• Present architecture and design options to stakeholders and conduct technical demos.

Expected Outcomes:

Delivery & Execution
• Code pipelines and solutions following best practices in scalability, performance, and maintainability.
• Complete documentation for architecture, source-target mappings, test cases, and performance benchmarks.
• Perform unit testing, debugging, and performance validation of data pipelines.

Quality & Process Adherence
• Ensure adherence to coding standards, project timelines, SLAs, and compliance protocols.
• Quickly resolve production bugs and reduce recurring issues through root cause analysis (RCA).
• Improve pipeline efficiency (e.g., faster run times, reduced costs).

Knowledge & Certifications
• Maintain up-to-date technical certifications and training.
• Contribute to reusable documentation, knowledge bases, and process improvements.

Skills & Expertise:

Technical Skills
• Programming: Proficiency in SQL, Python, and PySpark; familiarity with Scala.
• ETL tools: Azure Data Factory, Azure Databricks, Microsoft Fabric, Synapse Analytics, Informatica, Glue, DataProc.
• Cloud platforms: Expertise in Azure (including ADLS, Key Vault, Azure SQL, Synapse); familiarity with AWS or GCP is a plus.
• Data warehousing: Snowflake, BigQuery, Delta Lake, Lakehouse.
• Data modeling: Strong understanding of dimensional modeling, schema design, and optimization for large datasets.
• Security: Knowledge of data security and compliance best practices.

Soft Skills
• Strong analytical and troubleshooting skills.
• Excellent communication and collaboration capabilities.
• Ability to estimate and manage workload effectively.
• Customer-oriented approach with a focus on value delivery.

Preferred Qualifications:
• Experience with Microsoft Fabric and modern data lakehouse architectures.
• Exposure to ML/AI concepts and their integration with data pipelines.
• Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect).

Performance Metrics (KPIs):
• Adherence to engineering standards and timelines.
• Pipeline performance (run time, resource usage).
• Reduction in post-release defects and production incidents.
• Time to resolution for pipeline issues.
• Completion of certifications and mandatory training.
• Number of reusable assets created/shared.
• Compliance with data governance and security policies.

Certifications:
• Azure Data Engineer Associate (preferred)
• Relevant domain certifications in data engineering, cloud, or big data technologies

Tools & Technologies:
• ETL & orchestration: Azure Data Factory, Databricks, Apache Airflow, Glue, Talend
• Cloud services: Azure Synapse, ADLS, Azure SQL, Key Vault, Microsoft Fabric
• Programming: Python, SQL, PySpark, Scala (optional)
• Data platforms: Snowflake, BigQuery, Delta Lake, Azure Lakehouse
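
The pipeline duties above center on ingesting, transforming, and joining data with PySpark-style tooling. Below is a hedged, minimal PySpark sketch of one ingest-transform-write step; the paths, column names, and parquet sink are hypothetical stand-ins for a Databricks/ADLS setup.

```python
# Hedged sketch: a small PySpark ingest -> transform -> write step.
# Assumes: pip install pyspark (or a Databricks/Synapse Spark runtime).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-load").getOrCreate()

# Ingest raw events (path is a placeholder for an ADLS/Lakehouse location).
raw = spark.read.json("/data/raw/sales/2025-01-01/")

# Wrangle: drop malformed rows, derive a typed amount, aggregate per store.
daily = (
    raw.where(F.col("store_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .groupBy("store_id")
       .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Write a partition-friendly output (parquet chosen as a neutral example).
daily.write.mode("overwrite").parquet("/data/curated/sales_daily/")
```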

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
• Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
• Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
• Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases.
• Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
• Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
• Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
• 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
• Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
• Proficiency in Python (or similar) for automation, API integrations, and orchestration.
• Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
• Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
• Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
• Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
• Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
• Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
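
The responsibilities above lean on Snowflake Streams and Tasks for ELT. As a hedged sketch only, the snippet below creates a stream on a raw table and a task that folds new rows into a curated table on a schedule, issued through the Snowflake Python connector; the table, warehouse, and credential names are hypothetical placeholders.

```python
# Hedged sketch: set up a Snowflake Stream + Task pair for incremental ELT.
# Assumes: pip install snowflake-connector-python; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="PHARMA", schema="COMMERCIAL",
)
cur = conn.cursor()

# Stream: tracks new/changed rows on the raw landing table.
cur.execute("CREATE OR REPLACE STREAM RAW_SALES_STREAM ON TABLE RAW_SALES")

# Task: every 5 minutes, move pending stream rows into the curated table.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_CURATED_SALES
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO CURATED_SALES (SALE_ID, AMOUNT)
      SELECT SALE_ID, AMOUNT FROM RAW_SALES_STREAM
""")
cur.execute("ALTER TASK LOAD_CURATED_SALES RESUME")  # tasks are created suspended
conn.close()
```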

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
• Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
• Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
• Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases.
• Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
• Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
• Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
• 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
• Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
• Proficiency in Python (or similar) for automation, API integrations, and orchestration.
• Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
• Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
• Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
• Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
• Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
• Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: AWS, Analytics, Sales, SQL, Data, Snowflake, ETL/ELT Optimization, Python, Data Warehousing, Azure, Data Modeling, Data Governance, Cloud

Posted 1 month ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Details:

About the Role:
We're seeking a high-performing Senior Sales Manager to drive revenue growth and strategic account development in the software and hardware solutions space, with a strong focus on industries such as Banking, Financial Services, and Government. This is an individual contributor role, ideal for someone who is self-motivated, target-driven, and experienced in selling self-service terminals or retail-space solutions, sorting machines, payment systems, managed services, ATM solutions, and enterprise software products.

Key Responsibilities:
• Own and drive the entire sales lifecycle, from lead generation to deal closure.
• Develop new business opportunities across the banking, retail, fintech, and public sector verticals.
• Promote and sell a portfolio of hardware and software solutions related to banknote processing and vault management.
• Prepare and deliver compelling client presentations, RFP responses, and technical/product demonstrations.
• Build and maintain CXO-level relationships to position strategic, long-term value.
• Collaborate with product and pre-sales teams to ensure solutions align with client needs.
• Consistently achieve or exceed quarterly and annual revenue targets.

Required Experience & Skills:
• 10+ years of sales experience, with at least 5+ years in technology product sales (software and/or hardware).
• Strong domain knowledge in self-service terminals or retail-space solutions, sorting machines, payment systems, managed services, ATM solutions, and enterprise software products.
• Proven track record of meeting/exceeding sales targets in an individual contributor capacity.
• Ability to navigate long sales cycles, large deal sizes, and enterprise-level customers.
• Familiarity with solution-selling and value-based-selling methodologies.
• Excellent communication, negotiation, and stakeholder management skills.

Preferred Qualifications:
• Prior experience working with the banking industry, banking automation, or banknotes.
• Understanding of SaaS, IoT, or cloud-based solutions is a plus.
• Bachelor's degree in Business, Engineering, or a related field; an MBA is an advantage.

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Thane, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
• Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
• Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
• Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases.
• Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
• Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
• Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
• 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
• Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
• Proficiency in Python (or similar) for automation, API integrations, and orchestration.
• Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
• Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
• Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
• Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
• Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
• Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Mumbai Metropolitan Region

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
• Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
• Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
• Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases.
• Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
• Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
• Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
• 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
• Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
• Proficiency in Python (or similar) for automation, API integrations, and orchestration.
• Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
• Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
• Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
• Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
• Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
• Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Thane, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.
Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.
Skills & Qualifications
Must-Have
7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
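A star schema of the kind this role describes keeps additive measures in a fact table joined to surrogate-keyed dimensions. A minimal, hypothetical sketch of the DDL:

```python
# Minimal sketch: one fact table and two dimensions for sales reporting.
# All table and column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl_user", password="...")
cur = conn.cursor()

for ddl in [
    """CREATE OR REPLACE TABLE DIM_PRODUCT (
         PRODUCT_KEY INT IDENTITY,
         NDC_CODE STRING,
         BRAND_NAME STRING
       )""",
    """CREATE OR REPLACE TABLE DIM_TERRITORY (
         TERRITORY_KEY INT IDENTITY,
         TERRITORY_NAME STRING,
         REGION STRING
       )""",
    # Fact rows reference dimensions by surrogate key; measures stay additive.
    """CREATE OR REPLACE TABLE FACT_SALES (
         PRODUCT_KEY INT,
         TERRITORY_KEY INT,
         SALE_DATE DATE,
         UNITS NUMBER(12, 2),
         NET_REVENUE NUMBER(14, 2)
       )""",
]:
    cur.execute(ddl)
conn.close()
```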

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior Associate – CyberArk Implementation
Experience: 6–10 Years
Location: Bangalore / Hyderabad
Employment Type: Full-time
Level: Senior Associate
Key Responsibilities
Lead and participate in the design, deployment, and support of CyberArk Privileged Access Management (PAM) solutions.
Onboard privileged accounts for platforms including Windows, UNIX/Linux, Mainframe, Databases, and AWS/Azure.
Implement and maintain CyberArk components: PVWA, CPM, PSM, PSMP, PTA, AIM, Conjur.
Develop and customize connectors, policies, and workflows.
Manage password-rotation policies, session recording, and reporting.
Monitor and remediate PAM alerts and issues in production.
Apply best practices for Identity and Access Governance.
Create system documentation and user guides for CyberArk environments.
Support audit and compliance reporting using PAM logs.
Collaborate with cross-functional teams including Security, Infrastructure, and Application teams.
Mandatory Skills
4+ years of hands-on experience implementing CyberArk PAM.
Proficiency in onboarding systems (Windows, Linux/Unix, Databases, Mainframe, Cloud).
Experience with CyberArk CDE, PVWA, CPM, PSM, PSMP, PTA.
CyberArk Sentry certification (mandatory).
Strong knowledge of IAM concepts and privileged-access workflows.
Good understanding of IT general controls and audit compliance.
Nice to Have
Knowledge of other PAM tools such as BeyondTrust, Thycotic, or Centrify.
Scripting skills in PowerShell, Python, or Bash for automation.
Experience with Identity Governance tools such as SailPoint or Saviynt.
Exposure to DevOps environments and secret-management tools such as HashiCorp Vault.
Education & Certification
Bachelor’s or Master’s degree in Computer Science, Information Security, or related fields.
CyberArk Sentry certification (mandatory).
CyberArk CDE/Defender/Guardian is a plus.
Soft Skills
Strong written and verbal communication skills.
Ability to work in a collaborative team environment.
Excellent problem-solving and analytical thinking.
Skills: CyberArk CDE, Saviynt, AWS, Privileged Access Management, HashiCorp Vault, Bash, CyberArk, UNIX/Linux, Python, IAM concepts, AIM, Windows, PSM, CPM, PTA, PVWA, Linux, PSMP, Databases, CyberArk PAM, PowerShell, Conjur, DevOps, SailPoint, Mainframe, Azure
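As a small illustration of the Python automation listed under nice-to-have skills, the sketch below queries account objects through PVWA's REST interface. The endpoint paths follow CyberArk's v10+ REST convention and should be verified against your PVWA version; the host, credentials, and safe name are hypothetical.

```python
# Minimal sketch: list accounts in a safe via the PVWA REST API.
# Endpoint paths assume CyberArk's v10+ convention; host, credentials,
# and safe name are hypothetical placeholders.
import requests

PVWA = "https://pvwa.example.com"  # hypothetical PVWA host

# Logon returns a session token used as the Authorization header.
resp = requests.post(
    f"{PVWA}/PasswordVault/API/Auth/CyberArk/Logon",
    json={"username": "svc_automation", "password": "..."},
    timeout=30,
)
resp.raise_for_status()
token = resp.json()  # the v10+ API returns the token as a bare JSON string
headers = {"Authorization": token}

# Search accounts filtered to a single safe.
accounts = requests.get(
    f"{PVWA}/PasswordVault/API/Accounts",
    headers=headers,
    params={"filter": "safeName eq WindowsAdmins"},
    timeout=30,
).json()

for acct in accounts.get("value", []):
    print(acct.get("name"), acct.get("userName"), acct.get("address"))

# Log off to release the session.
requests.post(f"{PVWA}/PasswordVault/API/Auth/Logoff", headers=headers, timeout=30)
```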

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Nashik, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.
Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.
Skills & Qualifications
Must-Have
7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
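Right-sizing compute for cost efficiency, as this role puts it, usually combines aggressive auto-suspend with a credit cap. A minimal sketch; the warehouse name, monitor name, and quota figure are hypothetical:

```python
# Minimal sketch: auto-suspend plus a resource monitor to cap monthly spend.
# Warehouse name, monitor name, and quota are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="admin_user", password="...")
cur = conn.cursor()

# Suspend after 60 idle seconds; resume automatically on the next query.
cur.execute("""
    ALTER WAREHOUSE REPORTING_WH SET
      WAREHOUSE_SIZE = 'SMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE
""")

# Suspend the warehouse once the monthly credit quota is exhausted.
cur.execute("""
    CREATE OR REPLACE RESOURCE MONITOR REPORTING_QUOTA
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE REPORTING_WH SET RESOURCE_MONITOR = REPORTING_QUOTA")
conn.close()
```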

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Solapur, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.
Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.
Skills & Qualifications
Must-Have
7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
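The fine-grained access controls mentioned in this posting are typically implemented as role-based grants. A minimal sketch; every role, database, schema, and user name is hypothetical:

```python
# Minimal sketch: read-only RBAC for a reporting schema, including future grants.
# All role, database, schema, and user names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="security_admin", password="...")
cur = conn.cursor()

for stmt in [
    "CREATE ROLE IF NOT EXISTS SALES_ANALYST",
    # Read access requires usage on the containers plus select on the tables.
    "GRANT USAGE ON DATABASE COMMERCIAL TO ROLE SALES_ANALYST",
    "GRANT USAGE ON SCHEMA COMMERCIAL.REPORTING TO ROLE SALES_ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA COMMERCIAL.REPORTING TO ROLE SALES_ANALYST",
    # Future grants keep newly created tables covered automatically.
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA COMMERCIAL.REPORTING TO ROLE SALES_ANALYST",
    "GRANT ROLE SALES_ANALYST TO USER JDOE",  # hypothetical user
]:
    cur.execute(stmt)
conn.close()
```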

Posted 1 month ago

Apply

5.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Required Qualifications & Skills:
5+ years in DevOps, SRE, or Infrastructure Engineering.
Strong expertise in Azure Cloud & Infrastructure-as-Code (Terraform, CloudFormation).
Proficient in Docker & Kubernetes.
Hands-on with CI/CD tools & scripting (Bash, Python, or Go).
Strong knowledge of Linux, networking, and security best practices.
Experience with monitoring & logging tools (ELK, Prometheus, Grafana).
Familiarity with GitOps, Helm charts & automation.
Key Responsibilities:
Design & manage CI/CD pipelines (Jenkins, GitLab CI/CD, GitHub Actions).
Automate infrastructure provisioning (Terraform, Ansible, Pulumi).
Monitor & optimize cloud environments.
Implement containerization & orchestration (Docker, Kubernetes - EKS/GKE/AKS).
Maintain logging, monitoring & alerting (ELK, Prometheus, Grafana, Datadog).
Ensure system security, availability & performance tuning.
Manage secrets & credentials (Vault, Secrets Manager).
Troubleshoot infrastructure & deployment issues.
Implement blue-green & canary deployments.
Collaborate with developers to enhance system reliability & productivity.
Preferred Skills:
Certification: Azure DevOps Engineer.
Experience with multi-cloud, microservices, and event-driven systems.
Exposure to AI/ML pipelines & data-engineering workflows.
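For the secrets-and-credentials item above, here is a minimal sketch of reading a credential from HashiCorp Vault's KV v2 engine with the hvac client; the Vault address, mount point, and secret path are hypothetical, and in a real pipeline the token would come from the CI system rather than a literal.

```python
# Minimal sketch: fetch a credential from Vault's KV v2 engine via hvac.
# Address, token, mount point, and secret path are hypothetical.
import hvac

client = hvac.Client(url="https://vault.example.com:8200", token="...")
assert client.is_authenticated()

# KV v2 read: mount_point is the engine mount, path is the secret's location.
secret = client.secrets.kv.v2.read_secret_version(
    mount_point="secret",
    path="ci/deploy-db",
)
db_password = secret["data"]["data"]["password"]  # KV v2 nests data twice

# Use the value; never print or log it in a real pipeline.
print("fetched credential of length", len(db_password))
```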

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About us
Bain & Company is a global management consulting firm that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.
In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is the largest and an integral unit of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence, or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.
Who you will work with
This position is based in the BCN’s Gurgaon office and is an integral part of the Performance Improvement Centre of Expertise (PI CoE). PI CoE helps Bain case teams in India and abroad on multiple cost reduction/optimization activities through a suite of solutions ranging from Procurement (spend cube, analytics, maturity assessment, vendor strategy, etc.) and Supply Chain (inventory optimization, network optimization, manufacturing diagnostics, integrated business planning, etc.) to cost diagnostics (net working capital, value calculator, etc.), savings, and PMI analysis. The CoE undertakes complex and advanced analysis of clients' data and then generates critical insights to help achieve the cost objective. Over time we have built seamless solutions and use powerful, dynamic visualizations/charts on multiple platforms (Power BI, Tableau) to showcase our results. PI CoE is also involved in the generation of critical IP useful for new performance-improvement cases globally at Bain.
What you’ll do
We are looking for a high-performing Project Leader to join our Commercial Acceleration team supporting the Performance Improvement (PI) practice area. This team plays a critical role in driving proposal and client development efforts by partnering with Bain's consulting teams and subject-matter experts. In this role, you will lead a team comprising Associates and Analysts to deliver high-impact, client-ready proposals, sales pitches, and go-to-market materials. You will also support the development of intellectual property (IP), thought leadership, and Bain’s internal capabilities. The role requires strong problem-solving skills, an understanding of business operations and cost transformation, and the ability to manage multiple stakeholders across levels.
Key Responsibilities:
Lead proposal engagements across the Performance Improvement domain, including Procurement, Operations, Manufacturing, Supply Chain, and Cost Transformation.
Collaborate with Partners and Account teams to develop high-quality proposals and sales materials in a client-ready format.
Understand and apply Bain’s PI solutions and methodologies in the context of commercial proposals.
Analyze financial statements to identify cost-optimization opportunities.
Use benchmarking data (internal and external) to identify and validate value-creation opportunities across business functions.
Drive structured problem solving and storytelling for client situations and ensure alignment with Bain’s quality standards.
Manage client and internal team meetings effectively; present findings to senior leaders and Partners in a clear and compelling manner.
Foster collaboration with other BCN and global teams to ensure integrated proposal development and knowledge sharing.
Ensure high-quality output across workstreams through rigorous quality control and attention to detail.
Interpret data and distill insights with clear business implications and recommendations.
Provide regular, constructive feedback to team members; actively mentor and develop talent for future leadership roles.
Own workstream staffing, monitor team capacity, and proactively resolve any overload or resourcing issues.
Contribute to broader office initiatives such as recruiting, training, and business development as needed.
About you
If your highest qualification is undergraduate studies: 5-8 years of relevant experience in cost transformation, procurement, or performance improvement roles, with a strong academic record.
If your highest qualification is postgraduate studies: 3-6 years of relevant experience in cost transformation, procurement, or performance improvement roles, with a strong academic record.
Proven experience in consulting or proposal development, preferably within the Performance Improvement domain.
Deep understanding of operational levers in cost transformation, procurement, and supply-chain management.
Strong analytical, communication, and project-management skills.
Proficient in both written and spoken English.
Advanced proficiency in MS Excel and PowerPoint.
Demonstrated ability to manage teams and consistently deliver high-quality outputs under tight deadlines.
Strong stakeholder-management skills with the ability to collaborate effectively across functions and geographies.
What makes us a great place to work
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity, and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities, and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor, and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ, and parents.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

The SRE is part of an application team matrixed to the Cloud Services Team to perform a specialized function that focuses on the automation of availability, performance, maintainability, and optimization of business applications on the platform. To be effective in the position, an SRE must have strong AWS, Terraform, and GitHub skills, as the platform is 100% automated. All changes applied to the environment must be automated with Terraform and checked into GitHub version control. A matrixed SRE will be granted the Reliability Engineering role in the accounts they are responsible for. This role includes the rights to perform all the functions required to support the applications in the IaaS environment. An SRE is required to adhere to all Enterprise processes and controls (e.g., Change Management, Incident and Problem Management) and ensure alignment with Cloud standards and best practices.
Ability to write and implement infrastructure as code and platform automation.
Experience implementing Infrastructure as Code with Terraform.
Collaborate with Cloud Services and Application teams to deliver projects.
Deploy infrastructure-as-code (IaC) releases to QA, staging, and production environments.
Responsible for building the automation for any account customizations required by the application (custom roles, policies, security groups, etc.).
DevOps Engineer requirements:
Minimum 5 years of working experience, especially as a DevOps Engineer/SRE.
Should work in an individual-contributor (IC) role with very good verbal and written communication skills.
OS knowledge: 3+ years of hands-on working experience on Linux.
SCM: 3+ years of hands-on working experience with Git, preferably GitHub Enterprise.
Cloud experience: thorough knowledge of AWS; certification is preferred.
CI/CD tool: 4+ years of hands-on working experience with Jenkins, or with another CI/CD tool.
EKS CI/CD: working experience with Jenkins (or another CI/CD tool) for EKS; hands-on experience with Jenkins pipeline scripts is preferred.
Containers: minimum 1 year of hands-on working experience with Docker/Kubernetes; CKA (Certified Kubernetes Administrator) certification preferred.
Mulesoft Runtime Fabric: install and configure an Anypoint Runtime Fabric environment and deploy applications on Runtime Fabric.
Cloud infrastructure provisioning: 2+ years of hands-on working experience with Terraform/Terraform Enterprise/CloudFormation.
Application provisioning: 2+ years of hands-on working experience with Puppet/Ansible/Chef.
Data components: good knowledge and a minimum of 1 year of working experience with ELK, Kafka, and Zookeeper; HDF knowledge is an added advantage.
Tools: Consul and Vault knowledge is an added advantage.
Scripting: 3+ years of hands-on working experience with any scripting language (Shell/Python/Ruby, etc.).
Very good troubleshooting skills and hands-on working experience with production deployments and incidents.
Mulesoft knowledge: added advantage.
Java Spring Boot knowledge: added advantage.
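Since the role stresses that every change must flow through Terraform, a typical small automation is a drift check. The sketch below verifies that an EKS cluster is healthy and still carries the tags Terraform is expected to manage; the cluster name, region, and required tags are hypothetical.

```python
# Minimal sketch: SRE-style drift check on an EKS cluster with boto3.
# Cluster name, region, and required tags are hypothetical.
import boto3

REQUIRED_TAGS = {"managed-by": "terraform", "team": "cloud-services"}

eks = boto3.client("eks", region_name="us-east-1")
cluster = eks.describe_cluster(name="app-platform")["cluster"]

# Flag clusters that are unhealthy or have drifted from the tagging standard.
problems = []
if cluster["status"] != "ACTIVE":
    problems.append(f"cluster status is {cluster['status']}")
for key, expected in REQUIRED_TAGS.items():
    actual = cluster.get("tags", {}).get(key)
    if actual != expected:
        problems.append(f"tag {key!r} is {actual!r}, expected {expected!r}")

if problems:
    # In practice this would open an incident; here we just report.
    print("drift detected:", "; ".join(problems))
else:
    print("cluster healthy and tags match the Terraform standard")
```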

Posted 1 month ago

Apply