
615 Masking Jobs - Page 12

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Minimum 3 years of experience working in Test Data Management (TDM). Hands-on experience with tools such as CA Fast Data Masker, Informatica, and IBM Optim. Exposure to data masking/obfuscation. Hands-on experience in SQL, along with multiple databases such as Oracle, SQL Server, Greenplum, etc. Hands-on experience in data profiling, data masking, and reporting. Experience in training and mentoring juniors. Experience handling a small team of 3-4 associates. Hands-on experience working in an offshore-onshore model, with good communication skills. Hands-on experience in Java and Python will be a plus.
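As an illustrative aside, the sketch below shows the kind of deterministic masking such TDM tools apply: the same input always maps to the same token, so joins between masked tables still line up. It is a minimal Python example against an in-memory SQLite table; the table, column, and salt are hypothetical, not taken from any specific tool.

```python
import hashlib
import sqlite3

def mask_value(value: str, salt: str = "tdm-demo-salt") -> str:
    """Deterministically mask a value: the same input always yields the
    same token, which preserves joins across masked tables."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASKED_" + digest[:12]

# Hypothetical customer table used only for this demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "asha@example.com"), (2, "ravi@example.com")])

# Mask the PII column in place, as a TDM masking job would.
rows = conn.execute("SELECT id, email FROM customers").fetchall()
for row_id, email in rows:
    conn.execute("UPDATE customers SET email = ? WHERE id = ?",
                 (mask_value(email), row_id))

print(conn.execute("SELECT * FROM customers").fetchall())
```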

Posted 1 month ago

Apply

5.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

About Godrej Agrovet
Godrej Agrovet Limited (GAVL) is a diversified, Research & Development focused agri-business company dedicated to improving the productivity of Indian farmers by innovating products and services that sustainably increase crop and livestock yields. GAVL holds leading market positions in the different businesses it operates - Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry and Processed Foods. GAVL has a pan-India presence with sales of over a million tons annually of high-quality animal feed and cutting-edge nutrition products for cattle, poultry, aqua feed and specialty feed. Our teams have worked closely with Indian farmers to develop large Oil Palm Plantations, which are helping to bridge the demand-supply gap of edible oil in India. In the crop protection segment, the company meets the niche requirements of farmers through innovative agrochemical offerings. GAVL, through its subsidiary Astec Life Sciences Limited, is also a business-to-business (B2B) focused bulk manufacturer of fungicides & herbicides. In Dairy and Poultry and Processed Foods, the company operates through its subsidiaries Creamline Dairy Products Limited and Godrej Tyson Foods Limited. Apart from this, GAVL also has a joint venture with the ACI group of Bangladesh for the animal feed business in Bangladesh. For more information on the company, please log on to www.godrejagrovet.com.

Location: Mumbai

Job Purpose
We are seeking a highly skilled and experienced IT & OT Infrastructure, Data, and Applications Security Manager to lead the security strategy and implementation for IT & OT (Operational Technology) environments. This role is responsible for ensuring that critical infrastructure, network systems, and applications are secure from cyber threats while ensuring operational continuity in both the IT and OT domains. The position requires a deep understanding of both IT and OT security frameworks, as well as an ability to collaborate with cross-functional teams to safeguard digital assets and operations.

Roles & Responsibilities
IT & OT Infrastructure Security: Develop, implement, and maintain security policies, procedures, and controls to protect IT & OT infrastructure components, including servers, networks, industrial control systems (ICS), SCADA, and cloud environments. Collaborate with IT teams to ensure secure integration between IT and OT systems, addressing the unique security requirements of each domain. Conduct regular risk assessments, vulnerability scans, and penetration tests to identify and mitigate threats in IT & OT infrastructures. Manage the security of industrial networks, SCADA systems, and IIoT (Industrial Internet of Things) devices to prevent cyber threats and ensure safe operations. Implement and maintain security for cloud services, on-premises data centers, and critical OT assets, ensuring compliance with industry standards.
Data Security: Implement data encryption, tokenization, and masking techniques to protect sensitive and proprietary data across systems, databases, and storage devices. Oversee data classification processes and ensure data protection in compliance with legal and regulatory requirements (GDPR, CCPA, HIPAA, etc.). Ensure proper data backup, disaster recovery, and business continuity planning related to data security. Conduct data loss prevention (DLP) assessments and implement preventative controls. Manage access control policies for databases and ensure segregation of duties for sensitive information.
Network Security: Develop and maintain robust network security architecture for IT & OT networks, ensuring protection against unauthorized access, data breaches, and cyber-attacks. Monitor and analyze network traffic and logs to detect potential threats, vulnerabilities, and anomalous activities across IT & OT networks. Implement network segmentation to isolate IT and OT environments while ensuring controlled data exchange between systems. Configure and manage firewalls, intrusion detection/prevention systems (IDS/IPS), and secure VPNs to protect networks from external and internal threats. Manage secure communication channels for IT/OT devices and ensure the proper functioning of secure remote access protocols for IT/OT systems.
Applications Security: Lead the implementation of secure application development practices for OT applications. Work with development and OT engineering teams to incorporate secure coding practices into OT software systems. Conduct regular security assessments and code reviews for applications, ensuring that vulnerabilities are identified and mitigated. Oversee security testing of OT applications, including SCADA systems, human-machine interfaces (HMIs), and industrial control software, to ensure that security controls are in place. Implement security controls around application access, user authentication, and data integrity for OT applications.
Incident Response & Threat Management: Lead and coordinate response efforts to security incidents involving OT systems, ensuring that containment, investigation, and remediation processes are followed efficiently. Develop and maintain incident response plans that address OT-specific risks, ensuring minimal disruption to critical operations. Conduct post-incident analysis to identify root causes, recommend improvements, and apply corrective actions to prevent future occurrences. Collaborate with internal and external teams (e.g., law enforcement, vendors) during security incidents that may impact OT systems.
Security Governance and Compliance: Ensure compliance with relevant industry regulations, standards, and frameworks (e.g., NIST, ISO 27001, IEC 62443, NERC CIP) in OT environments. Implement and enforce security governance, risk management, and compliance strategies across OT assets. Perform regular audits and assessments of OT security controls to ensure compliance with security policies and regulatory requirements. Maintain comprehensive security documentation, including risk assessments, incident reports, and security project plans.
Security Awareness and Training: Develop and conduct security awareness training programs for OT staff, ensuring that they are educated on security best practices, emerging threats, and organizational policies. Provide ongoing education to the OT team about the importance of cybersecurity in the context of industrial operations and critical infrastructure. Stay current with emerging security trends, threats, and vulnerabilities specific to OT environments and incorporate new knowledge into security practices.
Educational Qualification: Bachelor's degree in Computer Science, Information Security, Cybersecurity, Engineering, or a related field (Master’s preferred).
Experience: Minimum of 5 to 6 years of experience in IT & OT security, data security, and application security. Extensive experience securing both IT and OT (industrial control systems, SCADA, ICS, IIoT) environments. Proven experience with network segmentation, firewalls, IDS/IPS, VPNs, and application security frameworks.
Familiarity with securing operational technology, including an understanding of industrial protocols (Modbus, OPC, DNP3, etc.). Hands-on experience with OT vulnerability management, incident response, and threat intelligence processes.
Skills: Expertise in securing network and infrastructure devices, systems, and industrial control systems (ICS). Deep knowledge of network protocols and security mechanisms (e.g., IP, TCP/IP, VPNs, firewalls). Proficiency in securing cloud environments (AWS, Azure, Google Cloud) as well as on-premises systems. Experience with tools for vulnerability scanning, penetration testing, and risk assessments (e.g., Nessus, Qualys, Burp Suite).
Certifications: CISSP, CISM, CISA, or similar certifications are preferred. OT-specific certifications such as Certified SCADA Security Architect (CSSA) or IEC 62443 certification are a plus. Network security certifications such as CCSP, AWS Certified Security Specialty, or CCNA Security are beneficial. Application security certifications (e.g., CEH, OWASP) are a bonus.
An inclusive Godrej: Before you go, there is something important we want to highlight. There is no place for discrimination at Godrej. Diversity is the philosophy of who we are as a company, and has been for over a century. It's not just a nice-to-have; it's in our DNA. Being more diverse - especially having our team members reflect the diversity of our businesses and communities - helps us innovate better and grow faster. We hope this resonates with you. We take pride in being an equal opportunities employer. We recognize merit and encourage diversity. We do not tolerate any form of discrimination on the basis of nationality, race, color, religion, caste, gender identity or expression, sexual orientation, disability, age, or marital status and ensure equal opportunities for all our team members. If this sounds like a role for you, apply now! We look forward to meeting you.
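As a hedged illustration of the tokenization this role calls for, the Python sketch below replaces a sensitive value with a keyed, non-reversible token. The key name and token format are hypothetical; a real deployment would source the key from a KMS/HSM rather than hard-coding it.

```python
import hmac
import hashlib

# Hypothetical secret; in practice this would come from a KMS/HSM.
TOKEN_KEY = b"replace-with-kms-managed-key"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a keyed, non-reversible token.
    Unlike plain hashing, an attacker without the key cannot
    brute-force tokens from guessed inputs."""
    mac = hmac.new(TOKEN_KEY, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + mac.hexdigest()[:16]

print(tokenize("4111-1111-1111-1111"))  # same input -> same token
```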

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description – Delphix TDM Professionals (Healthcare Domain)
We are hiring for multiple positions in Test Data Management (TDM) with strong expertise in Delphix. The ideal candidates should have experience in data de-identification, masking, and synthetic data generation, preferably in healthcare environments.
🔹 General Requirements (All Roles)
Minimum 5 years of experience in Test Data Management tools. Mandatory experience with Delphix (Data Virtualization & Masking). Strong knowledge of data de-identification & masking. Minimum 2 years of experience in synthetic data generation. Experience in aligning TDM with project roadmaps for faster test data delivery. Nice to have: Python, .NET knowledge, and exposure to CI/CD pipelines or cloud-hosted platforms.
Skills: SQL, cloud, CI/CD pipelines, Delphix, performance tuning, Python, synthetic data generation, data de-identification, cloud-hosted platforms, data masking, test data, design, data virtualization, TDM, .NET, shell scripting, Oracle
💼 Open Positions
Delphix Tech Lead (1 Role): Lead end-to-end Delphix solution design & implementation. Drive strategy, architecture, and team guidance. Collaborate across enterprise environments.
Delphix Senior Engineer (4 Roles): Design, deploy, and optimize Delphix virtualization & masking solutions. Mentor junior team members. Support best practices and innovation.
Delphix Engineer (2 Roles): Implement and manage Delphix environments. Support automation and integration with pipelines. Ensure performance in test data delivery.
Delphix Support Engineer (2 Roles): Provide operational support and troubleshooting for Delphix platforms. Ensure platform availability and resolve issues quickly.
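For illustration only: the snippet below sketches what synthetic test data generation means in practice, fabricating healthcare-flavored records from scratch so no production PHI is involved and nothing needs masking. It is plain Python with hypothetical field names, not Delphix's own API.

```python
import random
import uuid
from datetime import date, timedelta

random.seed(42)  # reproducible test data sets

FIRST = ["Asha", "Ravi", "Meera", "John", "Priya"]
LAST = ["Sharma", "Iyer", "Khan", "Das", "Nair"]
DIAGNOSES = ["E11.9", "I10", "J45.909"]  # sample ICD-10 codes

def synthetic_patient() -> dict:
    """Generate one fully synthetic patient record: no value is
    derived from real production data."""
    dob = date(1950, 1, 1) + timedelta(days=random.randint(0, 25000))
    return {
        "patient_id": str(uuid.uuid4()),
        "name": f"{random.choice(FIRST)} {random.choice(LAST)}",
        "dob": dob.isoformat(),
        "diagnosis_code": random.choice(DIAGNOSES),
    }

batch = [synthetic_patient() for _ in range(3)]
for record in batch:
    print(record)
```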

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description – Delphix TDM Professionals (Healthcare Domain)
We are hiring for multiple positions in Test Data Management (TDM) with strong expertise in Delphix. The ideal candidates should have experience in data de-identification, masking, and synthetic data generation, preferably in healthcare environments.
🔹 General Requirements (All Roles)
Minimum 5 years of experience in Test Data Management tools. Mandatory experience with Delphix (Data Virtualization & Masking). Strong knowledge of data de-identification & masking. Minimum 2 years of experience in synthetic data generation. Experience in aligning TDM with project roadmaps for faster test data delivery. Nice to have: Python, .NET knowledge, and exposure to CI/CD pipelines or cloud-hosted platforms.
Skills: SQL, cloud, CI/CD pipelines, Delphix, performance tuning, Python, synthetic data generation, data de-identification, cloud-hosted platforms, data masking, test data, design, data virtualization, TDM, .NET, shell scripting, Oracle
💼 Open Positions
Delphix Tech Lead (1 Role): Lead end-to-end Delphix solution design & implementation. Drive strategy, architecture, and team guidance. Collaborate across enterprise environments.
Delphix Senior Engineer (4 Roles): Design, deploy, and optimize Delphix virtualization & masking solutions. Mentor junior team members. Support best practices and innovation.
Delphix Engineer (2 Roles): Implement and manage Delphix environments. Support automation and integration with pipelines. Ensure performance in test data delivery.
Delphix Support Engineer (2 Roles): Provide operational support and troubleshooting for Delphix platforms. Ensure platform availability and resolve issues quickly.

Posted 1 month ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Shadow the design discussions the Senior Designer holds with clients; prepare minutes of meetings and keep track of project milestones to ensure timely, high-quality delivery. Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management and planners.
Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, or Diploma in Design. 0-1 year of experience in Interior Design / Architecture. Good communication & presentation skills. Basic knowledge of modular furniture. Practical knowledge of SketchUp. A great attitude.

Posted 1 month ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-71879-1
Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)
Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.
Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, spanning Engineering, Development, Operations, Onboarding and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.
Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. This role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. This role leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. The position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance, audit and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.
Key Responsibilities
Splunk Development & Platform Management: Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.
Splunk ITSI Implementation & Management: Develop and configure ITSI services, entities, and correlation searches. Implement notable events aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring.
Security & Compliance Enablement: Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI). Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability.
Stakeholder Engagement: Act as a strategic advisor and point of contact for business units, application, infrastructure and security stakeholders, and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization.
Process & Governance: Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices.
Required Skills/Knowledge
Bachelor's degree with a minimum of 6+ years of experience in technology, or in lieu of a degree, 8+ years of experience in technology. Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms. Splunk Subject Matter Expert (SME). Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations. Experience supporting security use cases, encryption visibility, secrets management, and compliance logging. Splunk development & platform management, security & compliance enablement, stakeholder engagement, and process & governance. Experience with Splunk premium apps - at minimum ITSI and Enterprise Security (ES). Experience with data streaming platforms & tools like Cribl and Splunk Edge Processor. Proven ability to work in Agile environments using tools such as JIRA or JIRA Align. Strong communication, leadership, and stakeholder management skills. Familiarity with security, risk, and compliance standards relevant to BFSI. Proven experience leading product development teams and managing cross-functional initiatives using Agile methods. Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud. Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking. Develop KPIs, services, glass tables, entities, deep dives, and notable events to improve service reliability for users across the firm. Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration. Develop new applications leveraging Splunk’s analytic and machine learning tools to maximize performance, availability and security, improving business insight and operations. Support senior engineers in analyzing system issues and performing root cause analysis (RCA).
Desired Skills/Knowledge
Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage. Awareness of data classification, retention, and masking/anonymization strategies. Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty). Experience with version control tools – Git, Bitbucket.
Eligibility Criteria
Bachelor's degree with a minimum of 6+ years of experience in technology, or in lieu of a degree, 8+ years of experience in technology. Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus.
Work Timings: 01:00 PM to 10:00 PM IST. This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.
For Internal Applicants
Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). Must not be on any corrective action plan (First Formal/Final Formal, PIP). Only L09+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply.
Level / Grade: 11
Job Family Group: Information Technology
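As a toy illustration of the anomaly detection this role describes (which Splunk MLTK would normally handle at scale), the sketch below flags an hourly event-count spike with a simple z-score. The counts and threshold are made up for the example.

```python
from statistics import mean, stdev

# Hypothetical hourly event counts pulled from a logging platform.
hourly_counts = [1180, 1210, 1195, 1232, 1208, 1190, 4950, 1201]

def zscore_anomalies(counts, threshold=2.0):
    """Flag counts whose z-score exceeds the threshold.
    With only a handful of samples, a single outlier inflates the
    stdev, so a threshold around 2 is more practical than 3."""
    mu, sigma = mean(counts), stdev(counts)
    return [(i, c) for i, c in enumerate(counts)
            if sigma and abs(c - mu) / sigma > threshold]

print(zscore_anomalies(hourly_counts))  # -> [(6, 4950)]
```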

Posted 1 month ago

Apply

9.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary... What you'll do...
This position is with the Data Platform Engineering team (People Data Platforms) under the Enterprise Business Services – People Technology organisation, focusing on the enterprise area of People Systems, which is amid a massive digital transformation. The objective of the People Tech organization is to build best-in-class engineering, analytics and data science solutions that power the best experience for our people, adhering to the Walmart philosophy - Everyday Low Cost. The People Data team is responsible for building and maintaining the People Data Lake platform and other analytical products that aim to democratize access to HR data, enabling tech teams and business users across Walmart with relevant, timely data by streamlining acquisition, curation and consumption of data from various HR systems. The team supports multiple use cases that focus on providing engaging employee experiences resulting in global company success. The team is spread over multiple locations, and we work towards providing the best experience to Walmart associates and business stakeholders.
We are seeking an accomplished Staff Software Engineer to join our Data Platform Engineering team. This critical role is for a passionate technical leader eager to architect, design, and implement robust, scalable, and secure data platforms and pipelines. Leveraging deep expertise in big data technologies, cloud security, and data governance, you will tackle complex, ambiguous challenges across multiple teams and critical business initiatives. Beyond hands-on contributions, you will mentor engineers, champion best practices, and significantly influence the overall technical direction and culture of the organization. Staff Engineers at Walmart lead through unparalleled technical excellence, strategic thinking, and cross-functional influence.
What You'll Do:
Define, drive, and be accountable for the technical strategy and architectural vision for the People Data Lake platform, data pipelines, and analytical products, ensuring alignment with overall business and engineering objectives. Lead the design and implementation of complex, multi-functional data solutions, particularly focusing on data security, access control, encryption, and privacy (e.g., Sensitive Data Protection framework, RBAC/ABAC, dynamic masking). Identify and evaluate critical technical challenges and opportunities across the engineering landscape, proposing innovative and impactful solutions that can span multiple teams. Champion and advocate for best practices in data engineering, software development, CI/CD, data quality, and operational excellence within the team and across the organization. Architect, develop, and maintain robust and scalable batch and streaming data pipelines using Google Dataproc, Apache Spark, Structured Streaming, and Airflow for orchestration. Design, develop, and maintain data security, access control, and encryption solutions within the data platform, working extensively with Google Cloud Platform (GCP) services including BigQuery, GCS, IAM, and KMS to develop secure data solutions. Implement and optimize data transformation, integration, and consumption layers (Medallion Architecture) to deliver high-quality, actionable data for diverse use cases. Design and implement secure APIs and microservices for seamless data access and integration. Utilize Infrastructure as Code (IaC) tools like Ansible and Terraform for automated cloud deployments and infrastructure management.
Lead the technical implementation of data governance frameworks, policies, and standards, with a strong focus on data privacy, security (encryption, masking), and regulatory compliance (e.g., GDPR, HIPAA, SOC 2). Collaborate closely with data governance teams and utilize tools like Collibra, EDP, and GCP Data Catalog to ensure proper classification, metadata management, and secure handling of sensitive HR data. Provide expert guidance and establish best practices over the management of data assets, including data quality, retention, and accessibility. Serve as a primary technical mentor, providing coaching and guidance to Senior and Staff Software Engineers, fostering their growth and development in complex technical areas, architectural design, and problem-solving methodologies. Cultivate a culture of technical excellence, continuous learning, and innovation within the team. Collaborate cross-functionally with product managers, business stakeholders, data scientists, and other engineering teams to translate complex business requirements into technical strategies and deliver impactful solutions. Effectively manage multiple initiatives by delivering and delegating as appropriate, ensuring completion of assigned tasks, and representing tasks and technical direction to stakeholders. Address ambiguous, high-impact technical problems that span multiple systems or teams, driving them to resolution and enabling team productivity. Evaluate new technologies and approaches, recommending their adoption where they can provide significant business value or technical advantage. Proactively identify and resolve systemic issues in data architecture, pipelines, and processes.
What You'll Bring:
Bachelor's degree in Computer Science, Engineering, or a related field, and 9-12 years of experience in software engineering, with a significant focus on data platforms. Strong programming skills in Python, Java, or Scala. Proven experience as a Staff Software Engineer or in a similar senior technical leadership role. Strong problem-solving skills and the ability to work in fast-paced environments. Demonstrable experience with Infrastructure as Code (IaC) tools like Ansible and Terraform for cloud deployments. Demonstrable experience in designing secure APIs and microservices, including common API architecture styles (REST, GraphQL, gRPC, RPC). Expertise in distributed data processing frameworks like Apache Spark on Google Dataproc, Kafka, Hadoop, Flink, or Apache Beam. Expertise in database technologies and distributed datastores (e.g., SQL, NoSQL, MPP databases such as BigQuery). Strong understanding of API design principles & NFRs (scalability, maintainability, availability, reliability). Hands-on experience with Google Cloud Platform (GCP) and its security-related services (IAM, KMS, Cloud Audit Logs, etc.). Exposure to data engineering techniques, including ETL pipeline development, data ingestion, and data wrangling. Solid understanding of security frameworks like OAuth, OpenID Connect, and JWT-based authentication. Experience designing secure APIs and microservices. Knowledge of data governance, compliance (GDPR, HIPAA, SOC 2, etc.), and regulatory requirements. Familiarity with encryption standards and cryptographic protocols. Experience with data orchestration tools such as Apache Airflow or Google Cloud Composer. Experience with streaming data systems like Kafka and Google Pub/Sub.
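To make the column-masking idea above concrete, here is a hedged PySpark sketch that hashes one column and partially masks another as a pipeline transform. The table, columns, and values are hypothetical, and real dynamic masking would typically be enforced in the warehouse (for example via BigQuery column-level policy tags) rather than in application code.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("masking-demo").getOrCreate()

# Hypothetical HR records standing in for a curated (silver) table.
df = spark.createDataFrame(
    [(1, "asha@example.com", "123-45-6789")],
    ["employee_id", "email", "ssn"],
)

masked = df.select(
    "employee_id",
    # Deterministic hash keeps the column joinable after masking.
    F.sha2(F.col("email"), 256).alias("email_hash"),
    # Partial masking keeps the last four digits for support use cases.
    F.concat(F.lit("***-**-"), F.substring("ssn", 8, 4)).alias("ssn_masked"),
)
masked.show(truncate=False)
```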
Knowledge of containerisation (e.g., Docker, Kubernetes) and how to deploy and scale data engineering workloads in cloud environments.
About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.
Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.
Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.
Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is—and feels—included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we’re able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.
Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunities Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions – while being inclusive of all people.
Minimum Qualifications...
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications. Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years’ experience in software engineering or related area. Option 2: 6 years’ experience in software engineering or related area.
Preferred Qualifications...
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Master’s degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area.
Primary Location...
RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India R-2211097

Posted 1 month ago

Apply

10.0 years

3 - 6 Lacs

Hyderābād

On-site

Job Requirements
About Phenom
At Phenom, our purpose is to help a billion people find the right job. We’re a global HR tech company delivering AI-powered talent experience solutions for enterprise organizations. Our intelligent platform helps companies attract, engage, and retain top talent.
Role Summary
We are looking for a Principal DBOps Engineer to lead the strategy, performance, automation, and scalability of our database systems. You will be the go-to expert for everything related to database operations, including reliability, observability, automation, and infrastructure-as-code across multiple environments. This is a hands-on leadership role with strong influence on architecture, security, and database lifecycle management.
Key Responsibilities
Design and manage highly available, scalable, and secure database architectures across production and non-production environments. Automate database provisioning, monitoring, backup, and recovery workflows using DevOps tools and Infrastructure-as-Code (IaC). Partner with Engineering, DevOps, and Product teams to ensure database performance, reliability, and data integrity. Lead incident response and Root Cause Analysis (RCA) for any database-related issues and outages. Guide the team on best practices around schema management, indexing strategies, and query performance. Mentor and lead a team of DB Engineers and collaborate cross-functionally with SREs, DevOps, and Cloud Architects. Establish data governance, auditing, and compliance protocols across multi-cloud environments (AWS, Azure, etc.). Evaluate and implement database observability solutions (Prometheus, Grafana, etc.). Optimize costs through usage monitoring, capacity planning, and right-sizing of cloud-based DB infrastructure.
Skills & Qualifications
Bachelor’s/Master’s degree in Computer Science, Information Systems, or a related field. 10+ years of experience in database administration and operations in high-scale production environments. Deep expertise in PostgreSQL, MySQL, MongoDB, or similar databases (relational and NoSQL). Proven experience in cloud-native DBOps (AWS RDS/Aurora, Azure SQL, GCP Cloud SQL, etc.). Strong scripting experience (Python, Bash, or Go) and use of automation frameworks (Ansible, Terraform). Exposure to containerization and orchestration (Kubernetes, Helm). Experience with CI/CD pipelines for DB changes and automated testing (Liquibase/Flyway). Solid understanding of database security, data masking, encryption, and access control models. Excellent communication, stakeholder management, and technical leadership skills.
Nice to Have
Certifications in cloud platforms (AWS, Azure, GCP). Experience with multi-region replication and disaster recovery design. Contributions to open-source DB tools or platforms.
Why Phenom?
Work with cutting-edge technologies in a high-impact role. Be part of a fast-growing product company solving real-world problems at scale. Culture focused on innovation, continuous learning, and collaboration.
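As a small, hedged example of the day-to-day database operations work described above, the Python sketch below uses psycopg2 (assumed installed) to surface long-running active queries from PostgreSQL's pg_stat_activity view, a typical first check during an incident. The DSN and five-minute threshold are hypothetical.

```python
import psycopg2  # assumes psycopg2-binary is installed

# Hypothetical DSN; real deployments would pull this from a secret store.
conn = psycopg2.connect("dbname=appdb user=dbops host=localhost")

# Flag queries that have been running for longer than five minutes.
SQL = """
    SELECT pid, now() - query_start AS runtime, state, query
    FROM pg_stat_activity
    WHERE state = 'active'
      AND now() - query_start > interval '5 minutes'
    ORDER BY runtime DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(SQL)
    for pid, runtime, state, query in cur.fetchall():
        print(f"pid={pid} runtime={runtime} state={state}\n  {query[:80]}")
```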

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

HMX Media Pvt. Ltd. is one of the fastest growing CGI advertising studios, with a team of professionals, experienced artists & technologists who create engaging visual experiences for international blue-chip clients. We specialize in crafting immersive experiences that attract & hold audiences across various platforms. We offer a range of services to tell stories through dynamic videos, striking photography & powerful real-time 3D interaction to stay relevant in this engaging & vibrant industry. We are looking for a talented & skillful Photoshop Artist. Please find the details below:
Job Title: Photoshop (Retouch) Artist
Experience: 0-5 years (freshers may also apply)
Required Software: Photoshop, Illustrator
Required Skills: Strong hands-on Photoshop skills. Photo retouching. Illustrator. Good selection and masking skills. Strong computer skills. Thorough knowledge of design and aesthetic principles.
Roles & Responsibilities: Photographic retouching artists will work primarily with 3D renders. Enhance images by correcting resolution and composition, cropping images and adjusting tone, color, saturation and brightness. Add or remove objects from an image or insert text. Photographic retouching artists work under the supervision of the presiding photographer.
Job Type: Full-time
Job Location: Pune (Balewadi, on-site)
Joining: Immediate/30 days

Posted 1 month ago

Apply

3.0 years

48 Lacs

Hyderābād

On-site

The Cloud Storage Administrator will manage and support cloud-based storage platforms in AWS and/or Azure. This role involves configuring, monitoring, and optimizing object, block, and file storage solutions to ensure high availability, performance, and data protection across our cloud infrastructure.
Required Skills
Administer and support cloud storage services such as Amazon S3, EBS, EFS, Glacier and Azure Blob, File and Archive Storage. Disaster mitigation design and implementation experience, with a focus on architecture for cross-region replication, backup management, RTO and RPO planning, and chaos engineering recovery; demonstrated use of AWS Elastic Disaster Recovery or Azure Site Recovery. Certification and privacy standards associated with PII, data protection and compliance gap expectations. Ability to identify and tag PII, applying encryption and masking techniques; knowledge and experience in compliance certification (SOC 2, ISO 27001, GDPR, etc.); demonstrated use of Amazon Macie or Azure Purview. Monitoring and cost optimization practices to proactively alert on performance, usage and anomalies; demonstrated use of AWS CloudWatch or Azure Monitor, and AWS Cost Explorer or Azure Cost Management. Embrace IaC and automation practices for backup, lifecycle, and archival policies; demonstrated expertise with AWS CloudFormation or Azure DevOps, and a history of using Terraform modules for cloud storage. Manage backup and recovery processes using native cloud tools and third-party solutions. Implement storage policies including lifecycle rules, replication, and access controls. Perform capacity planning and forecasting for storage growth and utilization. Collaborate with infrastructure and application teams to meet storage and data access requirements. Ensure storage systems comply with data protection, retention, and security standards. Document configurations, procedures, and best practices for storage management. Respond to incidents and service requests related to storage systems. Participate in change and incident management processes aligned with ITSM standards.
Required Experience
3+ years of experience in storage administration with cloud platforms (AWS, Azure, or both). Hands-on experience with cloud-native storage services and understanding of storage protocols. Experience with AWS CloudWatch, Azure Monitor, and the ability to set up proactive alerting on storage performance, usage, and anomalies. Strong troubleshooting and performance tuning skills related to storage. Familiarity with backup and disaster recovery solutions in cloud environments. Understanding of identity and access management as it pertains to storage services. Knowledge of ITSM processes such as incident, change, and problem management. Experience with storage cost monitoring tools like AWS Cost Explorer or Azure Cost Management. Knowledge of IaC tools (Terraform, CloudFormation) for provisioning storage resources and automating backup, lifecycle, and archival policies. Producing technical documentation. Exposure to enterprise backup solutions.
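To illustrate the lifecycle-rule automation this role covers, here is a hedged boto3 sketch that transitions objects under a prefix to Glacier after 30 days and expires them after a year. The bucket name, prefix, and retention periods are hypothetical; the same rule is commonly expressed in Terraform via the aws_s3_bucket_lifecycle_configuration resource.

```python
import boto3  # assumes AWS credentials are configured in the environment

s3 = boto3.client("s3")

# Hypothetical bucket and prefix used only for illustration.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-app-logs",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                # Move objects to Glacier after 30 days...
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                # ...and delete them after one year.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```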

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Do you want to help one of the most respected companies in the world reinvent its approach to data? At Thomson Reuters, we are recruiting a team of motivated data professionals to transform how we manage and leverage our commercial data assets. It is a unique opportunity to join a diverse and global team with centers of excellence in Toronto, London, and Bangalore. Are you excited about working at the forefront of the data-driven revolution that will change the way a company works? The Thomson Reuters Data and Analytics team is seeking an experienced Lead Engineer, Test Data Management, with a passion for engineering quality assurance solutions for cloud-based data warehouse systems.
About The Role
As Lead Engineer, Test Data Management, in this opportunity you will: Play a crucial role in ensuring the quality and reliability of our enterprise data systems; your expertise in testing methods, data validation, and automation is essential to bring best-in-class standards to our data products. Design test data management frameworks; apply data masking, data sub-setting, and synthetic data generation to create robust test data solutions for enterprise-wide teams. Collaborate with engineers, database architects, and data quality stewards to build logical data models, execute data validation, and design manual and automated testing. Mentor and lead the testing of key data development projects related to the Data Warehouse and other systems. Lead engineering team members in the implementation of test data best practices and the delivery of test data solutions. Be a thought leader investigating leading-edge quality technology for test data management and systems functionality, including performance testing for data pipelines. Innovate and create ETL mappings, workflows, and functions to move data from multiple sources into target areas. Partner across the company with analytics teams, engineering managers, architecture teams and others to design and agree on solutions that meet business requirements. Effectively communicate and liaise with other engineering groups across the organization, data consumers, and business analytic groups.
Utilize your experience in the following areas: SQL for data querying, validation, and analysis. Knowledge of database management systems (e.g., SQL Server, PostgreSQL, MySQL). Test data management tools (e.g., K2View, qTest, ALM, Zephyr). Proficiency in Python for test automation and data manipulation. PySpark for big data testing. Test case design, execution, and defect management. AWS cloud data practices and DevOps tooling. Performance testing for data management solutions, especially for complex data flows. Data security, privacy, and data governance compliance principles.
About You
You're a fit for the role of Lead Engineer if your background includes: 10+ years of experience as a tester, developer or data analyst, with experience in establishing end-to-end test strategies and planning for data validation, transformation, and analytics. Advanced SQL knowledge. Designing and executing test procedures and documenting best practices. Experience planning and executing regression testing, data validation, and quality assurance. Advanced command of data warehouse creation, management, and performance strategies. Experience engineering and implementing data quality systems in the cloud. Proficiency in a scripting language such as Python. Hands-on experience with data test automation applications (preference for K2View). Identification and remediation of data quality issues. Data management tools like K2View, Immuta, Alation, Informatica. Agile development. Business intelligence and data warehousing concepts. Familiarity with SAP and Salesforce systems. Intermediate understanding of big data technologies. AWS services and management, including serverless, container, queueing and monitoring services. Experience with creating manual or automated tests on data pipelines. Programming languages: Python. Data interchange formats: Parquet, JSON, CSV. Version control with GitHub. Cloud security and compliance, privacy, GDPR.
What’s in it For You?
Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
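The role above leans on data sub-setting, so here is a hedged, self-contained Python sketch of the core idea: pick a driver set of parent rows, then pull only the child rows that reference them so foreign keys stay valid. The schema and filter are hypothetical; production tools (e.g., K2View) do this with far more metadata awareness.

```python
import sqlite3

# Hypothetical source schema: orders reference customers.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         amount REAL);
    INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA'), (3, 'APAC');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 45.5), (12, 3, 12.0);
""")

# Subset driver: keep only APAC customers, then pull exactly the
# child rows that reference them, keeping referential integrity.
keep_ids = [r[0] for r in src.execute(
    "SELECT id FROM customers WHERE region = 'APAC'")]
placeholders = ",".join("?" for _ in keep_ids)
subset_orders = src.execute(
    f"SELECT * FROM orders WHERE customer_id IN ({placeholders})",
    keep_ids).fetchall()

print(keep_ids)        # [1, 3]
print(subset_orders)   # only orders for kept customers
```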

Posted 1 month ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Staff (CTM – Threat Detection & Response)
Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA. Minimum of Splunk Power User Certification. Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Assist in remote and on-site gap assessments of the SIEM solution. Work on defined evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations. Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.). Assist in evaluating the SIEM based on the defined criteria and prepare audit reports. Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers. Verification of data of log sources in the SIEM, following the Common Information Model (CIM). Experience in parsing and masking of data prior to ingestion in the SIEM. Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution. Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources. Assist clients with technical guidance to configure their log sources (in-scope) to be integrated into the SIEM. Experience in SIEM content development, which includes: hands-on experience in development and customization of Splunk Apps & Add-Ons; building advanced visualizations (interactive drilldowns, glass tables, etc.); building and integrating contextual data into notable events; experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks; capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Sound knowledge in configuration of alerts and reports. Good exposure to automatic lookups, data models and creating complex SPL queries. Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements. Experience in creating custom commands, custom alert actions, adaptive response actions, etc.
Qualification & Experience:
Minimum of 3 years’ experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies. Experience in a SOC as an L1/L2 analyst will be an added advantage. Strong oral, written and listening skills are an essential component of effective consulting. Good to have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Certification in any other SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage. Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
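Since the posting above emphasizes masking data prior to SIEM ingestion, here is a hedged, stdlib-only Python sketch of the idea: redacting two common PII patterns from a raw log line before it is forwarded. The patterns and placeholder tokens are illustrative; production pipelines would typically use the masking features of the ingestion tier instead (for example Splunk props/transforms or an edge processor).

```python
import re

# Simple patterns for two common PII types, for illustration only.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def redact(line: str) -> str:
    """Mask PII in a raw log line prior to forwarding to the SIEM."""
    line = EMAIL.sub("<email-redacted>", line)
    line = CARD.sub("<card-redacted>", line)
    return line

raw = "user=asha@example.com card=4111 1111 1111 1111 action=checkout"
print(redact(raw))
# -> user=<email-redacted> card=<card-redacted> action=checkout
```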

Posted 1 month ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Staff (CTM – Threat Detection & Response)
Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA. Minimum of Splunk Power User Certification. Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Assist in remote and on-site gap assessments of the SIEM solution. Work on defined evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations. Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.). Assist in evaluating the SIEM based on the defined criteria and prepare audit reports. Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers. Verification of data of log sources in the SIEM, following the Common Information Model (CIM). Experience in parsing and masking of data prior to ingestion in the SIEM. Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution. Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources. Assist clients with technical guidance to configure their log sources (in-scope) to be integrated into the SIEM. Experience in SIEM content development, which includes: hands-on experience in development and customization of Splunk Apps & Add-Ons; building advanced visualizations (interactive drilldowns, glass tables, etc.); building and integrating contextual data into notable events; experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks; capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Sound knowledge in configuration of alerts and reports. Good exposure to automatic lookups, data models and creating complex SPL queries. Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements. Experience in creating custom commands, custom alert actions, adaptive response actions, etc.
Qualification & Experience:
Minimum of 3 years’ experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies. Experience in a SOC as an L1/L2 analyst will be an added advantage. Strong oral, written and listening skills are an essential component of effective consulting. Good to have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Certification in any other SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage. Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Staff (CTM – Threat Detection & Response)

Key Capabilities:
Experience working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA.
Minimum of Splunk Power User certification.
Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash or PowerShell.
Assist in remote and on-site gap assessments of the SIEM solution.
Work on defined evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations.
Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.).
Assist in evaluating the SIEM against the defined criteria and prepare audit reports.
Good experience providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment.
Experience onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers.
Verification of log-source data in the SIEM, following the Common Information Model (CIM).
Experience in parsing and masking of data prior to ingestion in the SIEM.
Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution.
Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources.
Assist clients with technical guidance to configure their in-scope log sources for integration into the SIEM.
Experience in SIEM content development, which includes:
Hands-on experience in development and customization of Splunk Apps & Add-ons.
Building advanced visualizations (interactive drilldowns, glass tables, etc.).
Building and integrating contextual data into notable events.
Experience creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks.
Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that provide near-real-time visibility into the performance of client applications.
Sound knowledge of configuring alerts and reports.
Good exposure to automatic lookups, data models and writing complex SPL queries.
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements.
Experience creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
Minimum of 3 years’ experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies.
Experience in a SOC as an L1/L2 analyst will be an added advantage.
Strong oral, written and listening skills are an essential component of effective consulting.
Good-to-have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting.
Certification in any other SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage.
Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage.
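As an illustration of the complex-SPL side of the role, a rough sketch using the splunk-sdk Python package; the host, credentials, index and threshold are placeholders, and the search itself is just one example of a stats-based hunt.

import splunklib.client as client
import splunklib.results as results

# Placeholder connection details; a real deployment would use a token or vault.
service = client.connect(host="splunk.example.com", port=8089,
                         username="admin", password="changeme")

# A simple correlation-style SPL search: failed logins grouped by source IP.
query = ("search index=security sourcetype=linux_secure \"Failed password\" "
         "| stats count by src_ip | where count > 5")

# oneshot runs the search synchronously and returns a results stream.
reader = results.JSONResultsReader(service.jobs.oneshot(query, output_mode="json"))
for item in reader:
    if isinstance(item, dict):  # skip informational messages
        print(item.get("src_ip"), item.get("count"))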
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have
7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
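Since the governance bullet above centres on masking and fine-grained access control, here is a minimal sketch of a Snowflake dynamic masking policy driven from Python; the account, role, table and column names are invented for illustration.

import snowflake.connector

# Illustrative connection parameters; real credentials come from a secrets store.
conn = snowflake.connector.connect(
    account="xy12345", user="deploy_user", password="***",
    warehouse="ADMIN_WH", database="COMMERCIAL", schema="RX",
)

ddl = [
    # Mask patient identifiers for every role except a privileged one.
    """CREATE MASKING POLICY IF NOT EXISTS patient_id_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
            ELSE '***MASKED***' END""",
    # Attach the policy to the sensitive column.
    """ALTER TABLE rx_claims MODIFY COLUMN patient_id
       SET MASKING POLICY patient_id_mask""",
]

with conn.cursor() as cur:
    for stmt in ddl:
        cur.execute(stmt)
conn.close()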

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Thane, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have
7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
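For readers new to the Streams and Tasks mentioned above, a small sketch of an hourly merge pipeline; every object name is invented, and credentials are placeholders.

import snowflake.connector

# Illustrative connection; in practice credentials come from a secrets manager.
conn = snowflake.connector.connect(account="xy12345", user="etl_user",
                                   password="***", warehouse="ELT_WH",
                                   database="COMMERCIAL", schema="STAGE")

statements = [
    # Capture inserts/updates on the raw sales feed.
    "CREATE STREAM IF NOT EXISTS sales_raw_stream ON TABLE sales_raw",
    # A task that folds new rows into the curated table every hour,
    # but only when the stream actually has data.
    """CREATE TASK IF NOT EXISTS merge_sales
       WAREHOUSE = ELT_WH
       SCHEDULE = '60 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('SALES_RAW_STREAM')
       AS
       INSERT INTO sales_curated
       SELECT * FROM sales_raw_stream WHERE METADATA$ACTION = 'INSERT'""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_sales RESUME",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()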

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Mumbai Metropolitan Region

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have
7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
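One way the fine-grained access controls named above are commonly laid out is a read-only role with future grants; a sketch, with all role, database and user names hypothetical.

import snowflake.connector

# SECURITYADMIN (or a delegated role) typically owns role and grant management.
conn = snowflake.connector.connect(account="xy12345", user="secadmin",
                                   password="***", role="SECURITYADMIN")

grants = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",  # read-only analyst role
    "GRANT USAGE ON DATABASE COMMERCIAL TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA COMMERCIAL.RX TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA COMMERCIAL.RX TO ROLE ANALYST_RO",
    # Future grants keep newly created tables covered without re-running scripts.
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA COMMERCIAL.RX TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER jdoe",
]

with conn.cursor() as cur:
    for stmt in grants:
        cur.execute(stmt)
conn.close()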

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Nashik, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have
7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
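To ground the star-schema bullet, a toy dimensional model for weekly prescriber sales; the tables, columns and grain are illustrative only, not a prescribed design.

import snowflake.connector

conn = snowflake.connector.connect(account="xy12345", user="deploy_user",
                                   password="***", warehouse="ELT_WH",
                                   database="COMMERCIAL", schema="MART")

ddl = [
    # Conformed dimension: one row per prescriber.
    """CREATE TABLE IF NOT EXISTS dim_prescriber (
         prescriber_key INTEGER IDENTITY PRIMARY KEY,
         npi STRING, name STRING, specialty STRING, territory STRING)""",
    # Fact table keyed to the dimension; grain = prescriber x product x week.
    # Note: Snowflake records but does not enforce PK/FK constraints.
    """CREATE TABLE IF NOT EXISTS fact_weekly_sales (
         prescriber_key INTEGER REFERENCES dim_prescriber(prescriber_key),
         product_code STRING, week_start DATE,
         trx_count INTEGER, nrx_count INTEGER, sales_amount NUMBER(18,2))""",
]

with conn.cursor() as cur:
    for stmt in ddl:
        cur.execute(stmt)
conn.close()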

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Solapur, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have
7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
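On the right-sizing and cost-efficiency bullet, a small sketch that pulls 30-day credit burn per warehouse from Snowflake's ACCOUNT_USAGE share; connection details are placeholders, and the view requires the appropriate privileges.

import snowflake.connector

conn = snowflake.connector.connect(account="xy12345", user="admin_user",
                                   password="***", role="ACCOUNTADMIN")

# Credits burned per warehouse over the last 30 days, highest first.
query = """
    SELECT warehouse_name, SUM(credits_used) AS credits_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_30d DESC
"""

with conn.cursor() as cur:
    for warehouse, credits in cur.execute(query):
        print(f"{warehouse}: {credits:.1f} credits")
conn.close()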

Posted 1 month ago

Apply

0 years

2 - 3 Lacs

India

On-site

Job Title: Painter
Location: Serilingampally
Salary: ₹20,000 - ₹30,000 per month

Job Description:
We are looking for Painters to join our team at the Serilingampally location.

Key Responsibilities:
Prepare surfaces for painting (cleaning, sanding, filling cracks and holes).
Mix, match, and apply paints and finishes as per specifications.
Apply primer, paints, varnishes, or other finishes using brushes, rollers, or spray guns.
Protect surrounding areas using drop cloths or masking tape.
Ensure high-quality finishing and attention to detail.
Follow safety protocols and use protective equipment.
Clean up after completing work and maintain tools and equipment.

Requirements:
Proven experience as a painter (residential, commercial, or industrial).
Knowledge of various painting techniques and materials.
Good physical condition and ability to work at heights if required.
Attention to detail and precision.
Ability to work independently and as part of a team.

Education: No formal education required. Relevant experience is mandatory.
Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person

Posted 1 month ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Data Warehouse Administrator

Job Summary
We are seeking an experienced Data Warehouse Administrator with strong expertise in Snowflake to manage, monitor, and optimize our enterprise data-warehousing environment. The ideal candidate will be responsible for implementing and maintaining secure, scalable, and high-performance Snowflake solutions while ensuring data availability and reliability.

Key Responsibilities
Snowflake Administration: Manage Snowflake accounts, warehouses, databases, roles, and users. Monitor performance and resource usage, and optimize warehouse configurations. Handle data replication, failover, and disaster-recovery setup.
Data Management & Security: Implement security best practices: RBAC, masking, encryption. Support data governance and compliance requirements (e.g., GDPR, HIPAA).
ETL/ELT & Data Integration Support: Work closely with data engineers to support data pipelines and transformations. Manage integrations between Snowflake and tools like dbt, Fivetran, Airflow, etc.
Monitoring & Troubleshooting: Proactively identify performance bottlenecks and resolve issues. Implement alerts, usage monitoring, and cost tracking in Snowflake.
Upgrades & Maintenance: Stay current with Snowflake updates and implement new features. Schedule and manage routine maintenance, backups, and data archiving.
Documentation & Support: Create and maintain system documentation, runbooks, and best practices. Provide L2/L3 support for data-warehouse-related issues.

Required Skills & Qualifications
Bachelor's degree in Computer Science, Information Systems, or a related field.
3-5+ years of experience with data warehouse administration.
2+ years of hands-on experience with Snowflake.
Proficiency in SQL, scripting (Python or Bash), and version control (Git).
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with data modeling, ELT frameworks, and CI/CD practices.

Preferred Qualifications
Snowflake certifications (e.g., SnowPro Core/Advanced).
Experience with tools like dbt, Airflow, Fivetran, or Matillion.
Exposure to data cataloging and data governance tools (e.g., Collibra, Alation).

Soft Skills
Strong problem-solving and analytical skills.
Effective communication with technical and non-technical teams.
Ability to work independently and in a team-oriented environment.

(ref:hirist.tech)
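As one sketch of the cost-tracking and alerting responsibility above, the snippet below provisions a resource monitor and attaches it to a warehouse; the quota, thresholds and names are illustrative assumptions, not a recommended budget.

import snowflake.connector

conn = snowflake.connector.connect(account="xy12345", user="admin_user",
                                   password="***", role="ACCOUNTADMIN")

statements = [
    # Monthly credit budget with notify and suspend thresholds.
    """CREATE OR REPLACE RESOURCE MONITOR monthly_budget
       WITH CREDIT_QUOTA = 500
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 75 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    # Attach the monitor so the thresholds govern this warehouse's spend.
    "ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_budget",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()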

Posted 1 month ago

Apply