
533 Masking Jobs - Page 15

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 5.0 years

5 - 18 Lacs

Noida

On-site

We are seeking a skilled *Data Masking Engineer* with 4-5 years of experience in *SQL Server* and *Redgate tools* to design, implement, and manage data masking solutions. The ideal candidate will ensure sensitive data is protected while maintaining database usability for development, testing, and analytics. The candidate should be ready to relocate to Johannesburg, South Africa at the earliest opportunity.

Responsibilities:
- Design and implement *data masking strategies* for SQL Server databases to comply with security and privacy regulations (GDPR, HIPAA, etc.).
- Use *Redgate Data Masker* and other tools to anonymize sensitive data while preserving referential integrity.
- Develop and maintain masking rules, scripts, and automation workflows for efficient data obfuscation.
- Collaborate with *DBAs, developers, and security teams* to identify sensitive data fields and define masking policies.
- Validate masked data to ensure consistency, usability, and compliance with business requirements.
- Troubleshoot and optimize masking processes to minimize performance impact on production and non-production environments.
- Document masking procedures, policies, and best practices for internal teams.
- Stay current with *Redgate tool updates, SQL Server features, and data security trends*.

Qualifications:
- 4-5 years of hands-on experience in SQL Server database development/administration.
- Strong expertise in *Redgate Data Masker* or similar data masking tools (e.g., Delphix, Informatica).
- Proficiency in *T-SQL, PowerShell, or Python* for scripting and automation.
- Knowledge of *data privacy laws (GDPR, CCPA)* and secure data handling practices.
- Experience with *SQL Server security features* (Dynamic Data Masking, Always Encrypted, etc.) is a plus.
- Familiarity with *DevOps/CI-CD pipelines* for automated masking in development/test environments.
- Strong analytical skills to ensure masked data remains realistic for testing.
Preferred Qualifications:
- Redgate or Microsoft SQL Server certifications.
- Experience with *SQL Server Integration Services (SSIS)* or ETL processes.
- Knowledge of *cloud databases (Azure SQL, AWS RDS)* and their masking solutions.

Job Type: Full-time
Pay: ₹586,118.08 - ₹1,894,567.99 per year
Schedule: Day shift
Work Location: In person
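The listing's emphasis on "preserving referential integrity" is the crux of data masking: the same source value must map to the same masked token in every table, or joins break. Redgate Data Masker handles this natively; as a rough stdlib-Python sketch of the underlying idea (the key, prefix, and field names are hypothetical, not from the posting), deterministic keyed hashing achieves it:

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-me"  # hypothetical key; keep it in a vault, never in code

def mask_value(value: str, prefix: str = "CUST") -> str:
    """Deterministically pseudonymize a value: equal inputs always yield
    equal outputs, so foreign-key joins still work after masking."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"{prefix}-{digest[:12]}"

# The same customer key masks identically in both tables, so the
# orders -> customers relationship survives the masking pass.
orders = [("alice@example.com", 101), ("bob@example.com", 102)]
customers = {"alice@example.com": "Alice"}
masked_orders = [(mask_value(email), oid) for email, oid in orders]
masked_customers = {mask_value(email): "REDACTED" for email in customers}
```

This is one-way pseudonymization; SQL Server's built-in Dynamic Data Masking (mentioned in the qualifications) is different in kind, masking only at query time while leaving stored data unchanged.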

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hello, greetings from ZettaMine!

📌 Job Title: Storage Administrator – L1 / L2 / L3
Location: Hyderabad / Navi Mumbai
Employment Type: Full-Time – Direct Payroll
Joining: Immediate Joiners Preferred

🔹 L1 Storage Administrator – JD
Experience: 0–2 Years
Role Type: Entry-Level / Monitoring & Basic Support
Key Responsibilities:
- Monitor SAN/NAS environments and raise alerts as needed.
- Perform basic troubleshooting and escalate unresolved issues to L2/L3 teams.
- Assist with daily health checks of storage devices and backup systems.
- Log incidents and service requests in ticketing systems (e.g., ServiceNow).
- Provide support in executing standard operational procedures.
Skills:
- Basic understanding of storage systems (NetApp, Dell EMC, HPE).
- Familiarity with ITSM tools and ticketing workflows.
- Good communication and willingness to learn.
- Flexible with 24/7 rotational shifts.

🔹 L2 Storage Administrator – JD
Experience: 2–5 Years
Role Type: Mid-Level / Hands-on Operations & Troubleshooting
Key Responsibilities:
- Perform storage provisioning, zoning, masking, and replication tasks.
- Manage backup and restore operations using tools like Veritas, Commvault, or NetBackup.
- Conduct performance tuning and space management.
- Troubleshoot moderately complex storage-related issues.
- Work with the L1 team to guide and resolve tickets, escalating to L3 when needed.
- Create and maintain storage documentation.
Skills:
- Hands-on experience with SAN/NAS storage (NetApp, EMC, IBM, HPE).
- Good understanding of Fibre Channel, iSCSI, RAID, and LUN management.
- Experience with backup software and snapshot technologies.
- Basic scripting (PowerShell, Shell, etc.) is a plus.
- Strong understanding of ITIL practices.

🔹 L3 Storage Administrator – JD
Experience: 5+ Years
Role Type: Expert / Design, Escalation & Strategy
Key Responsibilities:
- Design and implement enterprise-grade storage solutions.
- Perform root cause analysis on complex incidents and provide permanent fixes.
- Lead capacity planning and performance optimization efforts.
- Work with cross-functional teams to support business continuity (DR/BCP).
- Oversee firmware updates, storage migrations, and automation initiatives.
- Mentor L1/L2 teams and develop knowledge base articles.
Skills:
- Deep expertise in multiple storage platforms (e.g., Dell EMC VMAX/Unity, NetApp ONTAP, IBM, Hitachi).
- Experience in automation (Python, PowerShell) and orchestration tools.
- Knowledge of storage integration with virtualization platforms (VMware, Hyper-V).
- Strong project management and documentation skills.
- Excellent troubleshooting and client interaction skills.

Interested candidates can reach out at md.afreen@zettamine.com

Thanks & Regards,
Afreen
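The L1 duties above center on monitoring and raising alerts, which in practice reduces to threshold checks against volume utilization. A minimal sketch of that logic (the thresholds and volume names are illustrative, not from the posting; real values come from the monitoring tool's policy):

```python
def classify_utilization(used_gb: float, total_gb: float,
                         warn: float = 0.80, crit: float = 0.90) -> str:
    """Map a volume's utilization ratio to a severity level."""
    ratio = used_gb / total_gb
    if ratio >= crit:
        return "CRITICAL"
    if ratio >= warn:
        return "WARNING"
    return "OK"

# Hypothetical volumes: (used GB, total GB)
volumes = {"vol_finance": (920, 1000), "vol_dev": (400, 1000)}
alerts = {name: classify_utilization(u, t) for name, (u, t) in volumes.items()}
# vol_finance at 92% crosses the critical threshold; vol_dev at 40% is fine.
```

Tools like Zabbix or SCOM evaluate essentially this rule on a schedule and open a ticket when the severity changes.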

Posted 1 month ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Details: Job Description

Minimum Qualification: Graduate (B.Tech)
Relevant Experience: 1+ years

As a System Administrator, you will provide Level 1 technical support. You will be the first point of contact for all infrastructure monitoring incidents and requests in a fast-paced professional environment. You will monitor, create, and triage incidents per the ITIL framework, create documentation and SOPs to streamline tasks, and resolve infrastructure-related issues. You will work in a timely and efficient manner while ensuring attendance, quality, and customer service metrics are met.

Requirements:
- Excellent written and verbal communication.
- Good exposure to ticketing tools (ServiceNow, JIRA, Cherwell, etc.).
- Good troubleshooting skills on Windows/Linux OS; troubleshooting on Windows Server 2012/2019 is a plus.
- Basic understanding of networking and network devices.
- Excellent understanding of infrastructure/application/job monitoring.
- Good exposure to monitoring tools (Zabbix, Dynatrace, SCOM, SolarWinds, Datadog, Nagios, etc.); administration of monitoring tools is a good-to-have skill.

Responsibilities:
- Provide L1+ support for team deliverables.
- Configure, allocate, and manage centralized storage systems and other deployed storage solutions.
- Help develop or engineer new solutions related to data storage and access.
- Help set up new storage technologies (tapeless, cloud storage, etc.).
- Handle most P2/P3/P4 tasks and document the RCCA.
- Interact with clients over calls and explain the reasoning behind deliverables.
- Troubleshoot and be able to fulfill L2 support.

Skills:
- Good knowledge of multiple backup and storage systems: Commvault, Rubrik, Veeam, etc.; EMC, NetApp, Oracle storage appliances, IBM SAN, etc.
- Experience with monitoring and performance tuning of various storage systems.
- Experience with capacity planning and storage allocation forecasting.
- Experience in storage operations management, backup systems, SAN, NAS, and DR.
- Working knowledge of systems from major data storage vendors and cloud services.
- Working knowledge of technical aspects of storage systems, including but not limited to DAS, NAS, Fibre Channel, RAID configuration, and LUN masking.
- Basic scripting.
- Good knowledge of Windows and Unix OS.
- Working knowledge of VMware/Hyper-V.
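Several of the storage listings on this page ask for LUN masking, which is the array-side control that exposes a LUN only to the host initiators explicitly granted to it. As an illustration of what a masking view decides (the WWPNs and LUN names below are made up), in miniature:

```python
# LUN masking in miniature: a LUN is visible to an initiator (host HBA,
# identified by its WWPN) only if that WWPN appears in the LUN's masking view.
masking_views = {
    "lun_12": {"10:00:00:00:c9:aa:bb:01", "10:00:00:00:c9:aa:bb:02"},  # prod hosts
    "lun_13": {"10:00:00:00:c9:aa:bb:03"},                             # backup host
}

def visible_luns(initiator_wwpn: str) -> set:
    """Return the set of LUNs a given initiator is allowed to see."""
    return {lun for lun, wwpns in masking_views.items() if initiator_wwpn in wwpns}
```

Fabric zoning solves the complementary problem on the switch side, restricting which initiator-target pairs can talk at all; masking then narrows visibility per LUN.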

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: L1 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 1–2 Years
Employment Type: Full-time
Job Summary: We are looking for an enthusiastic and detail-oriented L1 Storage Administrator to provide first-level support for storage infrastructure. The ideal candidate will assist in monitoring, basic troubleshooting, and operational tasks for enterprise storage systems.
Key Responsibilities:
- Monitor storage devices and alerts through tools and dashboards.
- Perform basic health checks and log reviews for SAN/NAS environments.
- Coordinate with L2/L3 teams for incident escalation.
- Maintain asset inventory, storage usage reports, and documentation.
- Support routine tasks like tape management, backup monitoring, and scheduled maintenance.
Requirements:
- Basic understanding of storage systems (NetApp, EMC, HPE, etc.).
- Familiarity with backup software (Commvault, Veritas, etc.).
- Knowledge of ITIL processes (incident, change, and problem management).
- Willingness to work in shifts and on-call rotations.

Job Title: L2 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 3–5 Years
Employment Type: Full-time
Job Summary: We are seeking a skilled L2 Storage Administrator responsible for the day-to-day operations, configuration, and troubleshooting of enterprise storage systems.
Key Responsibilities:
- Perform provisioning and de-provisioning of SAN/NAS storage.
- Troubleshoot medium-complexity storage and backup issues.
- Execute and validate backup/restore tasks and replication jobs.
- Work closely with server, network, and application teams to resolve performance issues.
- Implement data migration, upgrades, and firmware updates under L3 guidance.
Requirements:
- Experience with enterprise storage platforms (NetApp, Dell EMC, HPE 3PAR/Primera).
- Hands-on experience with backup and recovery solutions (e.g., Commvault, Veeam).
- Basic scripting knowledge (PowerShell, Bash) is a plus.
- Strong knowledge of zoning, LUN masking, and volume management.
- Ability to create detailed documentation and reports.

Job Title: L3 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 6–10+ Years
Employment Type: Full-time
Job Summary: We are hiring a highly experienced L3 Storage Administrator to design, manage, and troubleshoot complex storage infrastructures. You will serve as the highest level of technical escalation and work on capacity planning, architecture, and performance optimization.
Key Responsibilities:
- Lead the architecture and implementation of storage solutions.
- Perform root cause analysis and provide permanent resolutions for critical incidents.
- Design and implement DR/BCP strategies and data lifecycle management.
- Collaborate with vendors for escalations, patches, and feature enhancements.
- Evaluate and recommend new storage technologies and tools.
- Mentor L1/L2 admins and contribute to SOP development.
Requirements:
- Deep expertise in enterprise storage (EMC VMAX, NetApp AFF, HPE 3PAR, Pure Storage).
- Strong knowledge of SAN fabric (Brocade/Cisco), multipathing, and zoning.
- Proficient in backup/recovery and disaster recovery technologies.
- Scripting and automation (PowerShell, Python) experience preferred.
- IT certifications (e.g., NetApp NCDA, EMC Proven, SNIA, Veeam) highly preferred.

How to Apply:
📧 Send your updated resume to: latha.a@zettamine.com
Please include the following in your email: Full Name, Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period

#storageadmin #L1 #L2 #L3 #storage

Posted 1 month ago

Apply

9.0 years

5 - 10 Lacs

Thiruvananthapuram

On-site

9 - 12 Years | 1 Opening | Trivandrum

Role description

Role Proficiency: Leverage expertise in a technology area (e.g., Informatica transformation, Teradata data warehouse, Hadoop, analytics). Responsible for architecture for small/mid-size projects.

Outcomes:
- Implement data extraction and transformation, a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures), data analysis solutions, data reporting solutions, or cloud data tools in any one of the cloud providers (AWS/Azure/GCP).
- Understand business workflows and related data flows.
- Develop designs for data acquisition and data transformation or data modelling; apply business intelligence on data, or design data fetching and dashboards.
- Design information structure, work- and dataflow navigation. Define backup, recovery, and security specifications.
- Enforce and maintain naming standards and a data dictionary for data models.
- Provide or guide the team to perform estimates.
- Help the team develop proofs of concept (POC) and solutions relevant to customer problems.
- Able to troubleshoot problems while developing POCs.
- Architect/Big Data specialty certification (AWS/Azure/GCP/general, for example Coursera or a similar learning platform/any ML).

Measures of Outcomes:
- Percentage of billable time spent in a year developing and implementing data transformation or data storage.
- Number of best practices documented for any new tool or technology emerging in the market.
- Number of associates trained on the data service practice.

Outputs Expected:

Strategy & Planning:
- Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Implement methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Ensure that data strategies and architectures meet regulatory compliance requirements.
- Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities, or attend conferences on cloud data topics.

Operational Management:
- Help architects establish governance, stewardship, and frameworks for managing data across the organization.
- Provide support in implementing the appropriate tools, software, applications, and systems to support data technology goals.
- Collaborate with project managers and business teams on all projects involving enterprise data.
- Analyse data-related issues with systems integration, compatibility, and multi-platform integration.

Project Control and Review:
- Provide advice to teams facing complex technical issues in the course of project delivery.
- Define and measure project- and program-specific architectural and technology quality metrics.

Knowledge Management & Capability Development:
- Publish and maintain a repository of solutions, best practices, standards, and other knowledge articles for data management.
- Conduct and facilitate knowledge sharing and learning sessions across the team.
- Gain industry-standard certifications in the technology or area of expertise.
- Support technical skill building (including hiring and training) for the team based on inputs from the project manager/RTEs.
- Mentor new team members in technical areas.
- Gain and cultivate domain expertise to provide the best, optimized solutions to customers (delivery).

Requirement Gathering and Analysis:
- Work with customer business owners and other teams to collect, analyze, and understand requirements, including NFRs; define NFRs.
- Analyze gaps/trade-offs based on the current system context and industry practices; clarify requirements by working with the customer.
- Define the systems and sub-systems that make up the program.

People Management:
- Set goals and manage the performance of team engineers.
- Provide career guidance to technical specialists and mentor them.

Alliance Management:
- Identify alliance partners based on an understanding of service offerings and client requirements.
- In collaboration with the architect, create a compelling business case around the offerings.
- Conduct beta testing of the offerings and their relevance to the program.

Technology Consulting:
- In collaboration with Architects II and III, analyze the application and technology landscape, processes, and tools to arrive at the architecture options best fit for the client program.
- Analyze cost vs. benefits of solution options.
- Support Architects II and III in creating a technology/architecture roadmap for the client.
- Define the architecture strategy for the program.

Innovation and Thought Leadership:
- Participate in internal and external forums (seminars, paper presentations, etc.).
- Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency.
- Identify business opportunities to create reusable components/accelerators and reuse existing components and best practices.

Project Management Support:
- Assist the PM/Scrum Master/Program Manager in identifying technical risks and coming up with mitigation strategies.

Stakeholder Management:
- Monitor the concerns of internal stakeholders (such as Product Managers and RTEs) and external stakeholders (such as client architects) on architecture aspects; follow through on commitments to achieve timely resolution of issues.
- Conduct initiatives to meet client expectations.
- Work to expand the professional network in the client organization at the team and program levels.

New Service Design:
- Identify potential opportunities for new service offerings based on customer voice/partner inputs.
- Conduct beta testing/POCs as applicable.
- Develop collateral and guides for GTM.

Skill Examples:
- Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of architects.
- Use technology knowledge to create proofs of concept (POC) and (reusable) assets under the guidance of the specialist.
- Apply best practices in own area of work, helping with performance troubleshooting and other complex troubleshooting.
- Define, decide, and defend the technology choices made; review solutions under guidance.
- Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
- Use independent knowledge of design patterns, tools, and principles to create high-level designs for the given requirements; evaluate multiple design options and choose the appropriate one for the best possible trade-offs; conduct knowledge sessions to enhance the team's design capabilities.
- Review the low- and high-level designs created by specialists for efficiency (consumption of hardware and memory, memory leaks, etc.).
- Use knowledge of software development processes, tools, and techniques to identify and assess incremental improvements to the development process, methodology, and tools; take technical responsibility for all stages in the software development process; conduct optimal coding with a clear understanding of memory leakage and its impact.
- Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas.
- Use knowledge of project management and agile tools and techniques to support, plan, and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
- Use knowledge of project metrics to understand their relevance to the project; collect and collate project metrics and share them with the relevant stakeholders.
- Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
- Strong proficiency in understanding data workflows and dataflows; attention to detail; high analytical capabilities.

Knowledge Examples:
- Data visualization and data migration.
- RDBMSs (relational database management systems) and SQL.
- Hadoop technologies like MapReduce, Hive, and Pig.
- Programming languages, especially Python and Java.
- Operating systems like UNIX and MS Windows.
- Backup/archival software.

Additional Comments: Snowflake Architect

Key Responsibilities:
- Solution Design: Design the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms).
- Data Modeling: Design efficient and scalable physical data models within Snowflake. Define table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance.
- Security Architecture: Design the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies.
- Performance and Scalability Strategy: Design solutions with performance and scalability in mind. Define warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensure the architecture can handle future growth in data volume and user concurrency.
- Cost Optimization Strategy: Design architectures that are inherently cost-effective. Make strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks).
- Technology Evaluation and Selection: Evaluate and recommend specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements.
- Standards and Governance: Define best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization.
- Roadmap and Strategy: Align the Snowflake data architecture with overall business intelligence and data strategy goals. Plan for future enhancements and platform evolution.
- Technical Leadership: Provide guidance and mentorship to developers, data engineers, and administrators working with Snowflake.

Key Skills:
- Deep understanding of Snowflake's advanced features and architecture.
- Strong data warehousing concepts and data modeling expertise.
- Solution architecture and system design skills.
- Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates with them.
- Expertise in performance tuning principles and techniques at an architectural level.
- Strong understanding of data security principles and implementation patterns.
- Knowledge of various data integration patterns (ETL, ELT, streaming).
- Excellent communication and presentation skills to articulate designs to technical and non-technical audiences.
- Strategic thinking and planning abilities.

Looking for 12+ years of experience to join our team.

Skills: Snowflake, Data Modeling, Cloud Platforms, Solution Architecture

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
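The warehouse-sizing and cost-optimization duties above rest on Snowflake's credit model: each warehouse size step doubles the credits consumed per hour, starting from 1 credit/hour at X-Small. A back-of-the-envelope helper (per-second billing and the 60-second minimum are ignored for simplicity; verify rates against current Snowflake documentation):

```python
# Standard Snowflake warehouse sizes; each step up doubles credits/hour.
SIZES = ["XS", "S", "M", "L", "XL", "2XL", "3XL", "4XL"]

def credits_per_hour(size: str) -> int:
    """Credits burned per hour by one cluster of the given warehouse size."""
    return 2 ** SIZES.index(size)

def run_cost(size: str, hours: float, price_per_credit: float) -> float:
    """Estimated dollar cost of keeping one warehouse of `size` running."""
    return credits_per_hour(size) * hours * price_per_credit
```

For example, a Medium warehouse (4 credits/hour) running 2 hours at $3/credit costs about $24, which is why auto-suspend settings and right-sizing dominate Snowflake cost reviews.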

Posted 1 month ago

Apply

4.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction
In this role, you will work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions.

What you'll do: As a Data Engineer – Data Platform Services, responsibilities include:

Data Ingestion & Processing
- Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
- Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark).
- Working with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management
- Implementing Apache Iceberg tables for efficient data storage and retrieval.
- Managing distributed data processing with Cloudera Data Platform (CDP).
- Ensuring data lineage, cataloging, and governance for compliance with bank/regulatory policies.

Optimization & Performance Tuning
- Optimizing Spark and PySpark jobs for performance and scalability.
- Implementing data partitioning, indexing, and caching to enhance query performance.
- Monitoring and troubleshooting pipeline failures and performance bottlenecks.

Security & Compliance
- Ensuring secure data access, encryption, and masking using Thales CipherTrust.
- Implementing role-based access controls (RBAC) and data governance policies.
- Supporting metadata management and data quality initiatives.

Collaboration & Automation
- Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions.
- Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus.
- Supporting Denodo-based data virtualization for seamless data access.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- 4-7 years of experience in big data engineering, data integration, and distributed computing.
- Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP).
- Proficiency in Python or Scala for data processing.
- Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
- Understanding of data security, encryption, and compliance frameworks.

Preferred Technical and Professional Experience
- Experience in banking or financial services data platforms.
- Exposure to Denodo for data virtualization and DGraph for graph-based insights.
- Familiarity with cloud data platforms (AWS, Azure, GCP).
- Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
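The "data partitioning to enhance query performance" duty above comes down to routing records by a stable hash of their key, so that equal keys co-locate and joins or aggregations avoid a full shuffle. Spark does this with its hash partitioner; a plain-Python sketch of the same idea (pyspark itself is omitted; the keys and partition count are illustrative):

```python
import zlib

def partition_for(key: str, num_partitions: int = 8) -> int:
    """Stable hash partitioner: the same key always maps to the same
    partition, which is what lets co-partitioned joins skip the shuffle."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Route a small stream of account events into partitions.
events = ["acct-1", "acct-2", "acct-1", "acct-3"]
partitions = {}
for key in events:
    partitions.setdefault(partition_for(key), []).append(key)
```

Both "acct-1" events land in the same bucket, so a per-account aggregation can run partition-locally; the same property underlies Iceberg's partition transforms.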

Posted 1 month ago

Apply

4.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are seeking a skilled *Data Masking Engineer* with 4-5 years of experience in *SQL Server* and *Redgate tools* to design, implement, and manage data masking solutions. The ideal candidate will ensure sensitive data is protected while maintaining database usability for development, testing, and analytics. The candidate should be ready to relocate to Johannesburg, South Africa at the earliest opportunity.

Responsibilities:
- Design and implement *data masking strategies* for SQL Server databases to comply with security and privacy regulations (GDPR, HIPAA, etc.).
- Use *Redgate Data Masker* and other tools to anonymize sensitive data while preserving referential integrity.
- Develop and maintain masking rules, scripts, and automation workflows for efficient data obfuscation.
- Collaborate with *DBAs, developers, and security teams* to identify sensitive data fields and define masking policies.
- Validate masked data to ensure consistency, usability, and compliance with business requirements.
- Troubleshoot and optimize masking processes to minimize performance impact on production and non-production environments.
- Document masking procedures, policies, and best practices for internal teams.
- Stay current with *Redgate tool updates, SQL Server features, and data security trends*.

Qualifications:
- 4-5 years of hands-on experience in SQL Server database development/administration.
- Strong expertise in *Redgate Data Masker* or similar data masking tools (e.g., Delphix, Informatica).
- Proficiency in *T-SQL, PowerShell, or Python* for scripting and automation.
- Knowledge of *data privacy laws (GDPR, CCPA)* and secure data handling practices.
- Experience with *SQL Server security features* (Dynamic Data Masking, Always Encrypted, etc.) is a plus.
- Familiarity with *DevOps/CI-CD pipelines* for automated masking in development/test environments.
- Strong analytical skills to ensure masked data remains realistic for testing.
Preferred Qualifications:
- Redgate or Microsoft SQL Server certifications.
- Experience with *SQL Server Integration Services (SSIS)* or ETL processes.
- Knowledge of *cloud databases (Azure SQL, AWS RDS)* and their masking solutions.

Posted 1 month ago

Apply

8.0 - 11.0 years

6 - 9 Lacs

Noida

On-site

Snowflake - Senior Technical Lead Full-time Company Description About Sopra Steria Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. The world is how we shape it. Job Description Position: Snowflake - Senior Technical Lead Experience: 8-11 years Location: Noida/ Bangalore Education: B.E./ B.Tech./ MCA Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security Good to have Skills: Snowpark, Data Build Tool, Finance Domain Preferred Skills Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing. Experience in data warehousing, with at least 2 years focused on Snowflake. Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration. Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks. Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning. Familiarity with data security, compliance requirements, and governance best practices. Experience in Python, Scala, or Java for Snowpark development. Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM) Key Responsibilities Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost. 
Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe). Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion). Monitor query performance and resource utilization; tune warehouses, caching, and clustering. Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads. Define and enforce role-based access control (RBAC), masking policies, and object tagging. Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured. Establish best practices for dimensional modeling, data vault architecture, and data quality. Create and maintain data dictionaries, lineage documentation, and governance standards. Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets. Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies. Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives. Qualifications BTech/MCA Additional Information At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
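The masking-policy responsibility above refers to Snowflake's column-level policies, which return the raw value only to privileged roles. The underlying rule can be illustrated outside Snowflake with a short Python sketch (role names and the masking format are assumptions for illustration):

```python
# Sketch of the logic behind a role-based masking policy: privileged
# roles see the raw value, everyone else a redacted form. The role
# names and mask shape here are illustrative assumptions.

UNMASKED_ROLES = {"PII_ADMIN", "COMPLIANCE_AUDITOR"}

def apply_masking_policy(value: str, current_role: str) -> str:
    """Return the raw value for privileged roles; otherwise mask
    everything except the last four characters."""
    if current_role in UNMASKED_ROLES:
        return value
    return "*" * max(len(value) - 4, 0) + value[-4:]

print(apply_masking_policy("374245455400126", "ANALYST"))    # masked form
print(apply_masking_policy("374245455400126", "PII_ADMIN"))  # raw value
```

In Snowflake itself this branching lives in a `CREATE MASKING POLICY` expression attached to the column, so it is enforced at query time for every consumer.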

Posted 1 month ago

Apply

12.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Role Proficiency: Leverage expertise in a technology area (e.g. Informatica transformation, Teradata data warehouse, Hadoop, Analytics). Responsible for architecture for small/mid-size projects. Outcomes Implement either data extraction and transformation for a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures, data warehouse), a data analysis solution, data reporting solutions, or cloud data tools in any one of the cloud providers (AWS/Azure/GCP) Understand business workflows and related data flows. Develop design for data acquisition and data transformation or data modelling; apply business intelligence on data or design data fetching and dashboards Design information structure, work- and dataflow navigation. Define backup, recovery and security specifications Enforce and maintain naming standards and data dictionary for data models Provide or guide team to perform estimates Help team to develop proofs of concept (POC) and solutions relevant to customer problems. 
Able to troubleshoot problems while developing POCs Architect/Big Data Speciality Certification in (AWS/Azure/GCP/General, for example Coursera or similar learning platform/any ML) Measures Of Outcomes Percentage of billable time spent in a year for developing and implementing data transformation or data storage Number of best practices documented in any new tool and technology emerging in the market Number of associates trained on the data service practice Outputs Expected Strategy & Planning: Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap Implement methods and procedures for tracking data quality, completeness, redundancy and improvement Ensure that data strategies and architectures meet regulatory compliance requirements Begin engaging external stakeholders including standards organizations, regulatory bodies, operators and scientific research communities, or attend conferences with respect to data in cloud Operational Management Help Architects to establish governance, stewardship and frameworks for managing data across the organization Provide support in implementing the appropriate tools, software applications and systems to support data technology goals Collaborate with project managers and business teams for all projects involving enterprise data Analyse data-related issues with systems integration, compatibility and multi-platform integration Project Control And Review Provide advice to teams facing complex technical issues in the course of project delivery Define and measure project- and program-specific architectural and technology quality metrics Knowledge Management & Capability Development Publish and maintain a repository of solutions, best practices, standards and other knowledge articles for data management Conduct and facilitate knowledge sharing and learning sessions across the team Gain industry standard certifications on technology or area of expertise Support technical skill 
building (including hiring and training) for the team based on inputs from project manager/RTEs Mentor new members in the team in technical areas Gain and cultivate domain expertise to provide best and optimized solution to customer (delivery) Requirement Gathering And Analysis Work with customer business owners and other teams to collect, analyze and understand the requirements including NFRs/define NFRs Analyze gaps/trade-offs based on current system context and industry practices; clarify the requirements by working with the customer Define the systems and sub-systems that define the programs People Management Set goals and manage performance of team engineers Provide career guidance to technical specialists and mentor them Alliance Management Identify alliance partners based on the understanding of service offerings and client requirements In collaboration with Architect create a compelling business case around the offerings Conduct beta testing of the offerings and relevance to program Technology Consulting In collaboration with Architects II and III analyze the application and technology landscapes, processes and tools to arrive at the architecture options best fit for the client program Analyze cost vs benefits of solution options Support Architects II and III to create a technology/architecture roadmap for the client Define Architecture strategy for the program Innovation And Thought Leadership Participate in internal and external forums (seminars, paper presentations, etc.) Understand clients' existing business at the program level and explore new avenues to save cost and bring process efficiency Identify business opportunities to create reusable components/accelerators and reuse existing components and best practices Project Management Support Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies Stakeholder Management Monitor the concerns of internal stakeholders like Product Managers & RTEs and 
external stakeholders like client architects on Architecture aspects. Follow through on commitments to achieve timely resolution of issues Conduct initiatives to meet client expectations Work to expand professional network in the client organization at team and program levels New Service Design Identify potential opportunities for new service offerings based on customer voice/partner inputs Conduct beta testing/POC as applicable Develop collaterals and guides for GTM Skill Examples Use data services knowledge creating POCs to meet business requirements; contextualize the solution to the industry under guidance of Architects Use technology knowledge to create Proof of Concept (POC)/(reusable) assets under the guidance of the specialist. Apply best practices in own area of work, helping with performance troubleshooting and other complex troubleshooting. Define, decide and defend the technology choices made; review solutions under guidance Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST Use independent knowledge of Design Patterns, Tools and Principles to create high level design for the given requirements. Evaluate multiple design options and choose the appropriate options for best possible trade-offs. Conduct knowledge sessions to enhance team's design capabilities. Review the low and high level design created by Specialists for efficiency (consumption of hardware, memory, memory leaks, etc.) Use knowledge of Software Development Process Tools & Techniques to identify and assess incremental improvements for software development process, methodology and tools. Take technical responsibility for all stages in the software development process. Conduct optimal coding with clear understanding of memory leakage and related impact. 
Implement global standards and guidelines relevant to programming and development; come up with 'points of view' and new technological ideas Use knowledge of Project Management & Agile Tools and Techniques to support, plan and manage medium size projects/programs as defined within UST, identifying risks and mitigation strategies Use knowledge of Project Metrics to understand relevance in project. Collect and collate project metrics and share with the relevant stakeholders Use knowledge of Estimation and Resource Planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place Strong proficiencies in understanding data workflows and dataflow Attention to detail High analytical capabilities Knowledge Examples Data visualization Data migration RDBMSs (relational database management systems), SQL, Hadoop technologies like MapReduce, Hive and Pig. Programming languages, especially Python and Java Operating systems like UNIX and MS Windows. Backup/archival software. Additional Comments Snowflake Architect Key Responsibilities: Solution Design: Designing the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms). Data Modeling: Designing efficient and scalable physical data models within Snowflake. Defining table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance. Security Architecture: Designing the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies. Performance and Scalability Strategy: Designing solutions with performance and scalability in mind. 
Defining warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensuring the architecture can handle future growth in data volume and user concurrency. Cost Optimization Strategy: Designing architectures that are inherently cost-effective. Making strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks). Technology Evaluation and Selection: Evaluating and recommending specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements. Standards and Governance: Defining best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization. Roadmap and Strategy: Aligning the Snowflake data architecture with overall business intelligence and data strategy goals. Planning for future enhancements and platform evolution. Technical Leadership: Providing guidance and mentorship to developers, data engineers, and administrators working with Snowflake. Key Skills: Deep understanding of Snowflake's advanced features and architecture. Strong data warehousing concepts and data modeling expertise. Solution architecture and system design skills. Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates. Expertise in performance tuning principles and techniques at an architectural level. Strong understanding of data security principles and implementation patterns. Knowledge of various data integration patterns (ETL, ELT, Streaming). Excellent communication and presentation skills to articulate designs to technical and non-technical audiences. Strategic thinking and planning abilities. Looking for 12+ years of experience to join our team. Skills Snowflake, Data modeling, Cloud platforms, Solution architecture
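A recurring architectural requirement in this posting, masking that still joins correctly across tables, is usually met with deterministic pseudonymization: the same input always maps to the same token, so foreign keys keep matching after masking. A minimal sketch of the idea (the key and field names are assumptions; a real deployment would manage the key in a secrets store):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-in-a-real-vault"  # assumption: fetched from a vault in practice

def pseudonymize(value: str) -> str:
    """Deterministically tokenize a value with HMAC-SHA256 so that
    equal inputs yield equal tokens, which keeps joins working."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# Illustrative parent/child tables with a foreign key.
customers = [{"id": "C100", "email": "a@x.com"}]
orders = [{"order": 1, "customer_id": "C100"}]

masked_customers = [
    {**c, "id": pseudonymize(c["id"]), "email": pseudonymize(c["email"])}
    for c in customers
]
masked_orders = [{**o, "customer_id": pseudonymize(o["customer_id"])} for o in orders]

# Referential integrity survives masking: the foreign key still matches.
assert masked_orders[0]["customer_id"] == masked_customers[0]["id"]
```

Hashing is keyed (HMAC) rather than plain SHA-256 so that tokens cannot be reversed by brute-forcing small input domains without the key.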

Posted 1 month ago

Apply

8.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description About Sopra Steria Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. Job Description The world is how we shape it. Position: Snowflake - Senior Technical Lead Experience: 8-11 years Location: Noida/ Bangalore Education: B.E./ B.Tech./ MCA Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security Good to have Skills: Snowpark, Data Build Tool, Finance Domain Preferred Skills Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing. Experience in data warehousing, with at least 2 years focused on Snowflake. Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration. Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks. Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning. Familiarity with data security, compliance requirements, and governance best practices. Experience in Python, Scala, or Java for Snowpark development. Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM) Key Responsibilities Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost. 
Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe). Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion). Monitor query performance and resource utilization; tune warehouses, caching, and clustering. Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads. Define and enforce role-based access control (RBAC), masking policies, and object tagging. Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured. Establish best practices for dimensional modeling, data vault architecture, and data quality. Create and maintain data dictionaries, lineage documentation, and governance standards. Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets. Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies. Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives. Qualifications BTech/MCA Additional Information At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
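The Streams and Tasks features named above exist so that pipelines process only rows changed since the last run, instead of rescanning whole tables. The bookkeeping idea can be sketched in plain Python (a real Snowflake Stream also tracks updates, deletes, and transactional boundaries; this only shows incremental consumption):

```python
# Minimal emulation of a change stream: an append-only "table" plus an
# offset marking how far the consumer has read. Illustrative only.

class ChangeStream:
    def __init__(self):
        self.rows = []      # the underlying table
        self.offset = 0     # consumer position

    def insert(self, row):
        self.rows.append(row)

    def consume(self):
        """Return only rows added since the last consume() call,
        then advance the offset."""
        delta = self.rows[self.offset:]
        self.offset = len(self.rows)
        return delta

stream = ChangeStream()
stream.insert({"id": 1})
stream.insert({"id": 2})
first = stream.consume()   # both rows
stream.insert({"id": 3})
second = stream.consume()  # only the newly inserted row
```

In Snowflake, a Task scheduled on the stream plays the role of the `consume()` caller, and the stream's offset advances only when the consuming transaction commits.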

Posted 1 month ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Shadow design discussions the Senior Designer does with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane Software) and Sketchup; recommend enhancements and be a sounding board for the Senior Designer Be available for Site Visits, Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1yr of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of Modular furniture Practical knowledge of SketchUp A great attitude.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Shadow design discussions the Senior Designer does with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane Software) and Sketchup; recommend enhancements and be a sounding board for the Senior Designer Be available for Site Visits, Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1yr of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of Modular furniture Practical knowledge of SketchUp A great attitude.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

202408432 Gurugram, Haryana, India Thane, Maharashtra, India Preferred Description The Role: Partner with other architecture resources to lead the end-to-end architecture of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability. Develop broad understanding of the data lake architecture, including the impact of changes on a whole system, the onboarding of clients and the security implications of the solution. Design a new or improve upon existing architecture including data ingestion, storage, transformation and consumption layers. Define data models, schemas, and database structures optimized for H&B use cases including claims, census, placement, broking and finance sources. Design solutions for seamless integration of diverse health and benefits data sources. Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview. Evaluate data lake architecture to understand how technical decisions may impact business outcomes and suggest new solutions/technologies that better align to the Health and Benefits Data strategy. Draw on internal and external practices to establish data lake architecture best practices and standards within the team and ensure that they are shared and understood. Continuously develop technical knowledge and be recognised as a key resource across the global team. Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform is delivering to the highest possible standards and that solutions support stakeholder needs and business requirements. Initiate practices that will increase code quality, performance and security. Develop recommendations for continuous improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the potential implications of changes. 
Build the team’s technical expertise/capabilities/skills through the delivery of regular feedback, knowledge sharing, and coaching. High learning adaptability, demonstrating understanding of the implications of technical issues on business requirements and / or operations. Analyze existing data design and suggest improvements that promote performance, stability and interoperability. Work with product management and business subject matter experts to translate business requirements into good data lake design. Maintain the governance model on the data lake architecture through training, design reviews, code reviews, and progress reviews. Participate in the development of Data lake Architecture and Roadmaps in support of business strategies Communication with key stakeholders and development teams on technical solutions. Convince and present proposals by way of high-level solutions to end users and/or stakeholders. The Requirement: Candidate must have significant experience in a technology related discipline, such as IT or Engineering with a Bachelor's/College Degree in these areas being beneficial. Strong experience in databases, tools and methodologies Strong skills across a broad range of database technologies including Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and other Azure Services. Working knowledge of Microsoft Fabric is preferred. Data Analysis, Data Modeling, Data Integration, Data Warehousing, Database Design Experience with database performance evaluation and remediation Develop strategies for data acquisitions, archive recovery and implementation Be able to design and develop Databases, Data Warehouses and Multidimensional Databases Experience in Data Governance including Microsoft Purview, Azure Data Catalogue, Azure Data Share, and other Azure tools. Familiarity with legal risks related to data usage and rights. 
Experience in data security, including Azure Key Vault, Azure Data Encryption, Azure Data Masking, Azure Data Anonymization, and Azure Active Directory. Ability to develop database strategies for flexible high-performance reporting and business intelligence Experience using data modeling tools & methodology Experience working within an Agile Scrum Development Life Cycle, across varying levels of Agile maturity Experience working with geographically distributed scrum teams Excellent verbal and writing skills, including the ability to research, design, and write new documentation, as well as to maintain and improve existing material Technical competencies: Subject Matter Expertise Developing expertise You strengthen your depth and/or breadth of subject matter knowledge and skills across multiple areas. You define the expertise required in your area based on emerging technologies, industry practices. You build the team’s capability accordingly. Applying expertise You apply subject matter knowledge and skills across multiple areas to assess the impact of complex issues and implement long-term solutions. You foster innovation using subject matter knowledge to enhance tools, practices, and processes for the team. Solution Development Systems thinking You lead and foster collaboration across H&B Data Platform Technology to develop solutions to complex issues. You apply a whole systems approach to evaluating impact, and take ownership for ensuring links between structure, people and processes are made. Focusing on quality You instill a quality mindset to the team and ensure the appropriate methods, processes and standards are in place for teams to deliver quality solutions. You create and deliver improvement initiatives. Technical Communication Simplifying complexity You develop tools, aids and/or original content to support the delivery and/or understanding of complex information. You guide others on best practice. 
Qualifications Candidate must have significant experience in a technology related discipline, such as IT or Engineering with a Bachelor's/College Degree in these areas being beneficial.
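The Azure Data Masking and Data Anonymization work named in the requirements often reduces to generalizing quasi-identifiers before data leaves a controlled environment. A toy sketch of that generalization step (the field choices and bucket sizes are assumptions, not a specific Azure API):

```python
def generalize_record(record: dict) -> dict:
    """Coarsen quasi-identifiers: bucket age into decades and keep only
    the postcode prefix, making individual records harder to re-identify."""
    decade = (record["age"] // 10) * 10
    out = dict(record)
    out["age"] = f"{decade}-{decade + 9}"            # e.g. 47 -> "40-49"
    out["postcode"] = record["postcode"][:3] + "***"  # keep a coarse region only
    return out

print(generalize_record({"age": 47, "postcode": "122001"}))
```

Generalization like this is typically combined with suppression of direct identifiers (names, member IDs) and validated against a re-identification risk threshold before release.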

Posted 1 month ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description Job Title: Lead - Data Analyst The Purpose of this Role The Workplace Investment (WI) Test Data Management CoE is seeking an experienced Data Analyst to lead data exploration, reporting, and quality initiatives supporting test environments. You will analyze trends, detect anomalies, and provide actionable insights to help improve test data availability, accuracy, and data/environment stability. You will partner with engineering, QA, and TDM teams to ensure data decisions are grounded in reliable, timely information. The Expertise And Skills You Bring 6-9 years of experience as a Data Analyst, preferably in a testing, QA, or engineering environment Strong SQL skills and experience working with large, complex datasets Experience with reporting tools (Tableau, Power BI, or similar) You must possess a strong background in PL/SQL, Informatica, Snowflake Ability to investigate data issues, identify patterns, and explain root causes Familiarity with test data concepts, data masking, or data provisioning processes is a plus Strong communication and data storytelling skills; can explain findings to technical and non-technical teams Experience working with Agile teams, JIRA, and documentation tools (e.g., Confluence) Exposure to cloud platforms (AWS, Azure) and scripting languages (e.g., Python) is desirable Passion for data quality, consistency, and driving better decisions with facts The Value You Deliver You perform in-depth analysis of test data patterns, usage, and gaps across environments You detect anomalies or inconsistencies in data and help teams address root causes You build clear, insightful dashboards and reports to support data-driven decision-making You partner with QE, TDM, and engineering teams to define test data quality metrics You support test data compliance and masking efforts by identifying sensitive data across sources You contribute to environment stability by flagging issues early and helping teams prioritize fixes You document 
insights, trends, and recurring data issues to support long-term improvements How Your Work Impacts The Organization The Team The WI QE Center of Excellence (CoE) drives engineering excellence and data/environment stability across WI squads. As a Data Analyst, you’ll play a key role in ensuring test data is trustworthy, visible, and ready to support rapid, high-quality testing. Your insights will help shape how we build, monitor, and evolve our test data ecosystem. Location: TRIL, Chennai Timing: 11AM to 8PM Certifications Category: Information Technology
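The anomaly-detection work this role describes can start as simply as flagging metric values far from the recent mean. A hedged sketch using z-scores (the metric, data, and threshold are illustrative assumptions):

```python
import statistics

def flag_anomalies(values, z_threshold=2.5):
    """Flag points more than z_threshold population standard deviations
    from the mean. Returns [] when the series has no variance."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Hypothetical daily row counts for a test-data load; the last one looks wrong.
daily_row_counts = [1000, 1010, 990, 1005, 995, 1002, 998, 1001, 999, 5000]
print(flag_anomalies(daily_row_counts))  # flags the suspicious 5000-row load
```

Z-scores are a blunt first pass: they assume roughly symmetric noise and are themselves skewed by the outlier, so production monitoring usually moves to robust statistics (median/MAD) or seasonality-aware models.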

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Informatica TDM
1. Data Discovery, Data Subsetting and Data Masking
2. Data Generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica Mappings
5. Debugging with Informatica PowerCenter
6. Data Migration
Skills and Knowledge
1. Informatica TDM development experience in Data Masking, Data Discovery, Data Subsetting and Data Generation is a must
2. Should have experience in working with flat files, MS SQL, Oracle and Snowflake
3. Debugging using Informatica PowerCenter
4. Experience in Tableau will be an added advantage
5. Should have basic knowledge about IICS
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR
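Data subsetting of the kind Informatica TDM performs means carving out a consistent slice of a database: pick a subset of parent rows, then keep only the child rows that reference them, so the subset has no dangling foreign keys. A plain-Python sketch of that rule (the table shapes are assumptions):

```python
def subset_with_integrity(customers, orders, wanted_ids):
    """Keep only the selected customers and the orders that reference
    them, so the resulting subset has no dangling foreign keys."""
    kept_customers = [c for c in customers if c["id"] in wanted_ids]
    kept_ids = {c["id"] for c in kept_customers}
    kept_orders = [o for o in orders if o["customer_id"] in kept_ids]
    return kept_customers, kept_orders

customers = [{"id": 1}, {"id": 2}, {"id": 3}]
orders = [{"order": 10, "customer_id": 1}, {"order": 11, "customer_id": 3}]

# Selecting customers 1 and 2 drops order 11, whose parent is excluded.
subset_customers, subset_orders = subset_with_integrity(customers, orders, {1, 2})
```

Real subsetting tools generalize this to whole foreign-key graphs, walking relationships in both directions so every kept row's dependencies are also kept.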

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) KEY Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessment of the SIEM solution. Define evaluation criteria & approach based on the Client requirement & scope factoring industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, Architecture diagrams etc.) Evaluate SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. 
Offer consultative advice in security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer needs Experience in onboarding data into Splunk from various sources including unsupported (in-house built) sources by creating custom parsers Verification of data of log sources in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Build advanced visualizations (Interactive Drilldown, Glass tables etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI, etc. Sound knowledge in configuration of Alerts and Reports. Good exposure to automatic lookups, data models and creating complex SPL queries. 
Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per use case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions etc. Qualification & experience: Minimum of 3 to 6 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component to effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with the design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployments & management experience as per Vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate the issue and work with Splunk support to resolve issues Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, Securonix will be an added advantage Certifications in a core security related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0.0 - 1.0 years

0 - 3 Lacs

Noida

Work from Office

- Knowledge of Clipping Path (pen tool) is a must
- Basic knowledge of the Selection tool, color correction, cropping, retouching and resizing
- Should be comfortable working day/night shifts (extra perks for extra hours)

Required candidate profile:
- Ready to learn new things
- Freshers can apply
- Knowledge of the Adobe Photoshop pen tool is a must
- Good incentives for working extra hours

Posted 1 month ago

Apply

20.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior Data Solution Architect

Job Summary: The Senior Data Solution Architect is a visionary and technical leader responsible for designing and guiding enterprise-scale data solutions. Leveraging 20+ years of experience, this individual works closely with business and IT stakeholders to deliver scalable, secure, and high-performing data architectures that support strategic goals, data-driven innovation, and digital transformation. This role encompasses solution design, platform modernization, cloud data architecture, and deep integration with enterprise systems.

Key Responsibilities:

Solution Architecture & Design
- Lead the end-to-end architecture of complex data solutions across domains including analytics, AI/ML, MDM, and real-time processing.
- Design robust, scalable, and future-ready data architectures using modern technologies (e.g., cloud data platforms, streaming, NoSQL, graph databases).
- Deliver solutions that balance performance, scalability, security, and cost-efficiency.

Enterprise Data Integration
- Architect seamless data integration across legacy systems, SaaS platforms, IoT, APIs, and third-party data sources.
- Define and implement enterprise-wide ETL/ELT strategies using tools like Informatica, Talend, dbt, Azure Data Factory, or AWS Glue.
- Support real-time and event-driven architectures with tools such as Kafka, Spark Streaming, or Flink.

Cloud Data Platforms & Infrastructure
- Design cloud-native data solutions on AWS, Azure, or GCP (e.g., Redshift, Snowflake, BigQuery, Databricks, Synapse).
- Lead cloud migration strategies from legacy systems to modern, cloud-based data architectures.
- Define standards for cloud data governance, cost management, and performance optimization.

Data Governance, Security & Compliance
- Partner with governance teams to enforce enterprise data governance frameworks.
- Ensure solutions comply with regulations such as GDPR, HIPAA, CCPA, and industry-specific mandates.
- Embed security and privacy by design in data architectures (encryption, role-based access, masking, etc.).

Technical Leadership & Stakeholder Engagement
- Serve as a technical advisor to CIOs, CDOs, and senior business executives on data strategy and platform decisions.
- Mentor architecture and engineering teams; provide guidance on solution patterns and best practices.
- Facilitate architecture reviews, proof-of-concepts (POCs), and technology evaluations.

Innovation & Continuous Improvement
- Stay abreast of emerging trends in data engineering, AI, data mesh, data fabric, and edge computing.
- Evaluate and introduce innovative tools and patterns (e.g., serverless data pipelines, federated data access).
- Drive architectural modernization, legacy decommissioning, and platform simplification.

Qualifications:

Education: Bachelor's degree in computer science, engineering, information systems, or a related field; Master's or MBA preferred.

Experience:
- 20+ years in IT with at least 10 years in data architecture or solution architecture roles.
- Demonstrated experience in large-scale, complex data platform architecture and enterprise transformations.
- Deep experience with multiple database technologies (SQL, NoSQL, columnar, time series).
- Strong programming/scripting background (e.g., Python, Scala, Java, SQL).
- Proven experience architecting on at least one major cloud provider (AWS, Azure, GCP).
- Familiarity with DevOps, CI/CD, and DataOps practices.

Preferred Certifications:
- AWS/Azure/GCP Solution Architect (Professional level preferred)
- TOGAF or Zachman Framework certification
- Snowflake/Databricks Certified Architect
- CDMP (Certified Data Management Professional) or DGSP

Key Competencies:
- Strategic and conceptual thinking with the ability to translate business needs into technical solutions.
- Exceptional communication, presentation, and negotiation skills.
- Leadership in cross-functional teams and matrix environments.
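A minimal sketch of the "privacy by design" idea above (role-based access plus column-level masking) in plain Python. The role names and masking rules here are illustrative assumptions, not a particular product's policy engine:

```python
# Illustrative data-access layer that applies column-level masking based
# on the caller's role. Role names and rules are made up for the sketch.

def mask_pan(value: str) -> str:
    """Keep only the last four digits of a card number."""
    return "*" * (len(value) - 4) + value[-4:]

MASKING_POLICY = {
    # Analysts see masked values; a real system would also deny-by-default
    # for unknown roles rather than fall back to cleartext.
    "analyst": {"ssn": lambda v: "***-**-" + v[-4:], "card": mask_pan},
    "dba_admin": {},  # sees cleartext
}

def read_record(record: dict, role: str) -> dict:
    rules = MASKING_POLICY.get(role, {})
    return {col: rules.get(col, lambda v: v)(val) for col, val in record.items()}

row = {"name": "A. Kumar", "ssn": "123-45-6789", "card": "4111111111111111"}
print(read_record(row, "analyst"))
```

The design point is that masking lives in the access layer, so every consumer gets a policy-consistent view rather than each application re-implementing its own redaction.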
- Deep understanding of business processes, data monetization, and digital strategy.

Success Indicators:
- Delivery of transformative data platforms that enhance analytics and decision-making.
- Improved data integration, quality, and access across the enterprise.
- Successful migration to cloud-native or hybrid architectures.
- Reduction of technical debt and legacy system dependencies.
- Increased reuse of solution patterns, accelerators, and frameworks.

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

- Shadow design discussions the Senior Designer does with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery
- Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer
- Be available for site visits and masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems
- Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate
- Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners

Mandatory Qualifications:
- Design education background - B.Arch, B.Des, M.Des, Diploma in Design
- 0-1 yr of experience in Interior Design / Architecture
- Good communication & presentation skills
- Basic knowledge of modular furniture
- Practical knowledge of SketchUp
- A great attitude

Posted 1 month ago

Apply

40.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
- The candidate must have strong troubleshooting skills on Database and Database technology products
- Expertise in performance issue analysis and providing resolution
- Guide the customer on Oracle Database best practices
- Should possess knowledge on implementation and support of Database Security products like Transparent Data Encryption, Redaction, Data Vault, Masking
- Possess strong troubleshooting skills on Real Application Clusters
- Should be able to guide and mentor a team of engineers on Database technology products
- Should possess knowledge and be able to articulate to the customer the use cases of Advanced Compression, In-Memory
- Knowledge of Oracle Enterprise Manager

Personal Skills
- Strong experience in service delivery and/or project management is required; Oracle products and services knowledge will be highly appreciated, as well as experience in Oracle HW platforms and OS
- Experience with enterprise customers is required
- Excellent communication / relationship-building skills
- Customer focused and results oriented
- Ability to work under pressure in highly escalated situations
- Organized with strong attention to detail
- Decision making / problem solving skills
- Ability to manage multiple concurrent activities (customer engagements)
- Highly professional: ability to deal with senior and exec stakeholders with confidence
- Strong analytic skills and ability to pre-empt potential risks and issues

Career Level - IC4

Responsibilities
- Be the single point of contact within Oracle for the customer, acting as their advocate for the service you are responsible for delivering. The CSS TAM is a customer advocate and must demonstrate customer obsession by placing the client needs first.
- Provide technical guidance and be part of the customer calls/meetings on adoption of database technology
- Should possess strong technical skills on Database and DB products to advocate to the customer the use cases, and guide the customer and team of Oracle CSS Engineers through the lifecycle of Oracle Technology product adoption
- Manage the contract or delivery engagement as defined by ACS line management, including creating and maintaining accurate documentation
- Maintain the Oracle business systems to ensure systems are up to date with the correct/current information (resource assignment, timecards, rates, completion estimates, invoice details etc.) so that services are delivered efficiently, invoices are generated in a timely manner and revenues are recognised promptly
- Plan and deploy resources to ensure effective delivery within agreed budgetary constraints
- Where appropriate, create and maintain the ACS service delivery or project plan
- Actively manage the project forecast; identify risks and issues and opportunities for revenue collection (upside)

Accountabilities:
- Proactively manage the contract delivery to completion / customer acceptance
- Proactively report on any potential risks / issues that may impact service delivery or customer satisfaction
- Manage any customer escalation that may arise
- Ensure all contract-related systems and documentation, either required contractually or as part of a program, are up to date and accurate
- Monitor and report revenue forecast and margin estimates, revenue and margin achievements for each contract
- Work in line with customer working practices and procedures, if contractually agreed
- Operate in line with Oracle CSS business processes and procedures
- Operate in line with Oracle Global and local HR policies and procedures

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

10.0 - 13.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things.

Job Title: Senior Data Engineer
Location: Bangalore

Business & Team: The Technology Team is responsible for the world-leading application of technology and operations across every aspect of CommBank, from innovative product platforms for our customers to essential tools within our business. We also use technology to drive efficient and timely processing, an essential component of great customer service. CommBank is recognised as leading the industry in IT and operations with its world-class platforms and processes, agile IT infrastructure, and innovation in everything from payments to internet banking and mobile apps.

The Group Security (GS) team protects the Bank and our customers from cyber compromise, through proactive management of cyber security, privacy, and operational risk. Our team includes:
- Cyber Strategy & Performance
- Cyber Security Centre
- Cyber Protection & Design
- Cyber Delivery
- Cyber Data Engineering
- Cyber Data Security
- Identity & Access Technology

The Group Security Data Engineering team provides specialised data services and platforms for the CommBank group and is accountable for developing the Group’s data strategy, data policy & standards and governance, and for setting requirements for data enablers/tools. The team is also accountable for facilitating a community of practitioners to share best practice and build data talent and capabilities.
Impact & contribution: To ensure the Group achieves a sustainable competitive advantage through data engineering, you will play a key role in supporting and executing the Group's data strategy. We are looking for an experienced Data Engineer to join our Group Security Team, which is part of the wider Cyber Security Engineering practice. In this role, you will be responsible for setting up the Group Security Data Platform to ingest security telemetry data from across the organisation, along with additional data assets and data products. This platform will provide security controls and services leveraged across the Group.

Roles & Responsibilities: You will be expected to perform the following tasks in a manner consistent with CBA's Values and People Capabilities.

CORE RESPONSIBILITIES:
- Hands-on technical experience working in AWS, with knowledge of AWS services such as EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, KMS, Step Functions, SQS, SNS and CloudWatch.
- A robust set of technical and soft skills; an excellent AWS Data Engineer with a focus on complex automation and engineering framework development.
- Being well-versed in Python is mandatory, and experience in developing complex frameworks using Python is required.
- Passionate about Cloud/DevSecOps/Automation, with a keen interest in solving complex problems systematically.
- Drive the development and implementation of scalable data solutions and data pipelines using various AWS services.
- Ability to work independently and collaborate closely with team members and technology leads.
- A proactive approach, constantly seeking innovative solutions to complex technical challenges.
- Can take responsibility for nominated technical assets related to areas of expertise, including roadmaps and technical direction.
- Can own and develop technical strategy, overseeing medium to complex engineering initiatives.

Essential Skills:
- About 10-13 years of experience as a data engineering professional in a data-intensive environment, with strong analytical and reasoning skills in the relevant area.
- Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS and CloudWatch.
- Excellent skills in Python-based framework development are mandatory.
- Proficiency in SQL for efficient querying, managing databases, handling complex queries, and optimizing query performance.
- Excellent automation skills, such as automating the testing framework using tools like pytest (covering unit, integration and functional tests and mocks), automating the data pipeline, and expediting tasks such as data ingestion and transformation.
- API-based automated and integrated calls (REST, cURL, authentication & authorization, tokens, pagination, OpenAPI, Swagger).
- Implementing advanced engineering techniques and handling ad hoc requests to automate processes on demand.
- Implementing automated and secured file transfer protocols like XCOM, FTP, SFTP, and HTTP/S.
- Experience with Terraform, Jenkins, TeamCity and Artifactory is essential as part of DevOps; Docker and Kubernetes are also considered.
- Proficiency in building orchestration workflows using Apache Airflow.
- Strong understanding of streaming data processing concepts, including event-driven architectures.
- Familiarity with CI/CD pipeline development, such as Jenkins.
- Extensive experience and understanding of data modelling, SCD types, data warehousing, and ETL processes.
- Excellent experience with GitHub or any preferred version control system.
- Expertise in data pipeline development using various data formats/types.
- Mandatory knowledge and experience in big data processing using PySpark/Spark and performance optimization of applications.
- Proficiency in handling various file formats (CSV, JSON, XML, Parquet, Avro, and ORC) and automating processes in the big data environment.
- Ability to use Linux/Unix environments for development and testing.
- Awareness of security best practices to protect data and infrastructure, including encryption, tokenization, masking, firewalls, and security zones.
- Well-structured documentation skills and the ability to create a well-defined knowledge base.
- Certifications such as AWS Certified Data Analytics/Engineer/Developer – Specialty or AWS Certified Solutions Architect.
- Ability to design robust, efficient, and cost-effective data engineering pipelines that are highly available and dynamically scalable on demand, enabling systems to respond effectively to high demand and heavy loads while maintaining high throughput and high I/O performance with no data loss.
- Own and lead the E2E data engineering life cycle, from requirement gathering through design, development, testing, delivery and support, as part of the DevSecOps process.
- Must demonstrate the skills and mindset to implement encryption methodologies like SSL/TLS, data encryption at rest and in transit, and other data security best practices.
- Hands-on work experience with data design tools like Erwin, with demonstrated capability in building data models, data warehouses, data lakes, data assets and data products.
- Must be able to constructively challenge the status quo and lead the establishment of data governance and metadata management, ask the right questions, and design with the right principles.

Education Qualification: A Bachelor's or Master's degree in Engineering, specializing in Computer Science, Information Technology or relevant qualifications.
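Among the data-protection techniques listed above, tokenization is worth a concrete sketch: deterministic tokens let masked datasets still join correctly. This is a minimal illustration in plain Python; the key handling and token format are assumptions (in practice the key would come from a secrets manager such as AWS Secrets Manager):

```python
import hashlib
import hmac

# Deterministic tokenization: the same input always yields the same
# token, so joins across tokenized tables still line up (referential
# integrity). Hard-coding the key here is purely illustrative.
SECRET_KEY = b"demo-key-do-not-use-in-production"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:16]

# Stable for repeated inputs, distinct for different inputs.
assert tokenize("alice@example.com") == tokenize("alice@example.com")
assert tokenize("alice@example.com") != tokenize("bob@example.com")
print(tokenize("alice@example.com"))
```

Using an HMAC rather than a plain hash means an attacker who knows the scheme still cannot precompute tokens for guessed values without the key.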
If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696. Advertising End Date: 29/06/2025

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Designation: Stores Executive
Qualification: Dip (ECE/EEE) or any Arts & Science degree with 3 - 5 yrs of experience in Stores

Job Responsibilities
- SFO sticker preparation
- SFO material pullout
- Upload the date / lot details in PHP
- MRR issue, SMS / email trigger and PHP updation
- Physical inventory count
- Vacuum sealing of incoming PCBs
- Inward materials - binning sticker pasting
- Inward materials - binning at location
- Shortage list preparation
- Verification of pulled flag
- MRV materials binning sticker pasting
- MRV materials binning at location
- Closed R-MRR materials binning sticker pasting
- Closed R-MRR materials binning at location
- Existing stock RM QR code data collection
- Existing stock RM QR code sticker pasting
- ERP 1st level entry
- Collection of materials from INQC, verification and storage in appropriate rack
- Rejection disposal & tolling material courier (gatepass and statutory document preparation)
- Customer property entry in portal
- Rejection re-inward
- Rejection material updation in portal and move to location
- Immediate issue of inward material (before binning)
- Rejection & referrals portal updation
- Handover of materials to PDN-Mech for masking

Experience: 3 - 5 Years
Industry Type: Electronics Design and Manufacturing
Functional Area: Executive
Location: Chennai
Email: jobs@datapatterns.co.in
Address: Data Patterns (India) Limited, Plot No. H9, 4th Main Road, SIPCOT IT Park, Off Rajiv Gandhi Salai (OMR), Siruseri, Chennai - 603 103

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kerala, India

Remote

Job Description
Our customers will measure our contribution to their success based on the value they receive from our services. TAMs are responsible for the overall governance and technical service delivery. They help customers maximize the business value of their Oracle investments, achieving the desired business outcomes while minimizing risk. To do this, TAMs must become trusted advisors to the customer, ensure consistency and quality of deliverables, help customers deliver their IT strategy, overcome challenges and meet business goals, and use leading practices for successful Oracle technology and Cloud deployments and operations. The Services Portfolio includes Managed Services, On-Premise, Hybrid Cloud, Applications, Platforms and Databases (SaaS/PaaS/IaaS), and Security services that TAMs may manage in full or in part.

Description
- The candidate must have strong troubleshooting skills on Database and Database technology products
- Expertise in performance issue analysis and providing resolution
- Guide the customer on Oracle Database best practices
- Should possess knowledge on implementation and support of Database Security products like Transparent Data Encryption, Redaction, Data Vault, Masking
- Possess strong troubleshooting skills on Real Application Clusters
- Should be able to guide and mentor a team of engineers on Database technology products
- Should possess knowledge and be able to articulate to the customer the use cases of Advanced Compression, In-Memory
- Knowledge of Oracle Enterprise Manager

Personal Skills
- Strong experience in service delivery and/or project management is required; Oracle products and services knowledge will be highly appreciated, as well as experience in Oracle HW platforms and OS
- Experience with enterprise customers is required
- Excellent communication / relationship-building skills
- Customer focused and results oriented
- Ability to work under pressure in highly escalated situations
- Organized with strong attention to detail
- Decision making / problem solving skills
- Ability to manage multiple concurrent activities (customer engagements)
- Highly professional: ability to deal with senior and exec stakeholders with confidence
- Strong analytic skills and ability to pre-empt potential risks and issues

Database Administrator Responsibilities
- Experience with Oracle Real Application Clusters (RAC), Data Guard, ASM and RMAN
- Monitor, analyze and optimize database performance to ensure high availability and reliability in a RAC environment
- Implement DR solutions using Oracle standby databases with Oracle Data Guard
- Monitor the data backup process and perform data recovery if need be
- Perform database tuning, including SQL query optimization, indexing strategies and resource management
- Expert in analyzing explain plans, AWR reports, OEM, and other diagnostic tools to identify potential performance bottlenecks
- Manage Oracle database instances (installation, configuration, upgrades, and patching)
- Identify bottlenecks in database systems and propose effective solutions
- Cross-platform migration and Oracle Data Pump utilities
- Collaborate with development teams to review and optimize SQL code and schema design

Qualifications:
Education: Bachelor's degree in computer science, IT, or related field. Oracle Database certifications (e.g., OCA, OCP) are preferred.
Experience: 8-12 years of hands-on experience with Oracle Database administration and support.
Shift: 24x7 shift working from the client site in Kerala. No remote work or WFH allowed.
Technical Skills:
- In-depth knowledge of Oracle Database architecture and internals
- Proficiency in SQL, PL/SQL, and database performance tuning
- Experience with Oracle Real Application Clusters (RAC), Data Guard, ASM and RMAN
- Basic knowledge of Cloud (OCI, AWS or Azure)
- Experience with Oracle Enterprise Manager (OEM) and monitoring tools
- Understanding of database security principles (encryption, user management)

Career Level - IC4

Responsibilities
- Develop and manage Oracle customer relationships by forming long-term relationships with key customer contacts
- Work is non-routine and complex, involving the application of advanced technical/business skills in the area of specialization
- Provide direction and mentoring to more junior team members
- Understand the customer's industry drivers, organization structure and key stakeholders, key projects and goals, and critical success factors, as well as technical infrastructure and roadmap
- Work collaboratively with sales, the delivery teams and customers to identify appropriate solutions
- Coordinate delivery of Oracle Services, operating as the primary delivery contact to the customer, aiding and facilitating customer communications and activities across other Oracle lines of business
- Responsible for delivering to the contracted terms, effective and efficient use of Oracle delivery resources, and achieving the contract margin and revenue objectives
- Identify and submit delivery leads for new opportunities and contract renewals
- Act as a point of contact for any major incidents, responsible for managing communication and customer expectations through resolution
- Establish and maintain a delivery governance model with the customer at the management and executive levels
- Perform scope and risk management
- Contribute to initiatives for Oracle delivery organizational process improvement and tool development
- Conduct periodic Service Account Planning and Account Reviews
- Be the single point of contact within Oracle for the customer, acting as their advocate for the service you are responsible for delivering. The CSS TAM is a customer advocate and must demonstrate customer obsession by placing the client needs first.
- Provide technical guidance and be part of the customer calls/meetings on adoption of database technology
- Should possess strong technical skills on Database and DB products to advocate to the customer the use cases, and guide the customer and team of Oracle CSS Engineers through the lifecycle of Oracle Technology product adoption
- Manage the contract or delivery engagement as defined by ACS line management, including creating and maintaining accurate documentation
- Maintain the Oracle business systems to ensure systems are up to date with the correct/current information (resource assignment, timecards, rates, completion estimates, invoice details etc.) so that services are delivered efficiently, invoices are generated in a timely manner and revenues are recognised promptly
- Plan and deploy resources to ensure effective delivery within agreed budgetary constraints
- Where appropriate, create and maintain the ACS service delivery or project plan
- Actively manage the project forecast; identify risks and issues and opportunities for revenue collection (upside)

Accountabilities:
- Proactively manage the contract delivery to completion / customer acceptance
- Proactively report on any potential risks / issues that may impact service delivery or customer satisfaction
- Manage any customer escalation that may arise
- Ensure all contract-related systems and documentation, either required contractually or as part of a program, are up to date and accurate
- Monitor and report revenue forecast and margin estimates, revenue and margin achievements for each contract
- Work in line with customer working practices and procedures, if contractually agreed
- Operate in line with Oracle CSS business processes and procedures
- Operate in line with Oracle Global and local HR policies and procedures

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team: Deloitte helps organizations prevent cyberattacks and protect valuable assets. We believe in being secure, vigilant, and resilient—not only by looking at how to prevent and respond to attacks, but at how to manage cyber risk in a way that allows you to unleash new opportunities. Embed cyber risk at the start of strategy development for more effective management of information and technology risks.

Your work profile: As Assistant Manager in our Cyber Team you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations.

We are looking for a skilled Cribl Data Engineer to design, manage, and optimize data pipelines that process and route machine data at scale. The ideal candidate will have hands-on experience with Cribl Stream, Cribl Edge, or Cribl Search, and a strong understanding of telemetry data workflows, observability tools, and data platforms like Splunk, Sentinel, Elastic, or S3.
- Design and build streaming data pipelines using Cribl Stream for routing, transforming, and enriching logs, metrics, and trace data.
- Configure data sources (e.g., Syslog, HEC, TCP, S3, Kafka) and destinations (e.g., Splunk, Sentinel, Elasticsearch, data lakes).
- Develop pipelines, routes, packs, and knowledge objects using Cribl's UI and scripting features.
- Optimize data ingestion workflows to reduce costs, improve performance, and enhance data usability.
- Implement filtering, masking, sampling, and transformation logic using Cribl Functions (Regex, Eval, Lookup, JSON, etc.).
- Work with SIEM and observability teams to ensure clean, enriched, and correctly formatted data flows into tools like Splunk, Sentinel, S3, or OpenSearch.
- Monitor Cribl infrastructure and debug pipeline issues in real time using Cribl Monitoring and Health Checks.
- Implement version control, testing, and CI/CD for Cribl pipelines (using GitHub or GitLab).
- Participate in PoC evaluations, vendor integrations, and best-practices documentation.

Desired qualifications
Education: Bachelor's degree in Information Security, Computer Science, or a related field. A Master's degree in Cybersecurity or Business Management is preferred.
Experience: 3 to 5 years.
- Hands-on experience with Cribl Stream and knowledge of Cribl Edge or Cribl Search.
- Strong understanding of log formats (Syslog, JSON, CSV, Windows Event Logs, etc.).
- Familiarity with SIEM platforms like Splunk, Microsoft Sentinel, Elastic Stack, QRadar, or Exabeam.
- Proficient in regex, JSON transformations, and scripting logic.
- Comfortable with cloud platforms (AWS/Azure/GCP) and object storage systems (e.g., S3, Azure Blob).
- Familiarity with Kafka, Fluentd, Fluent Bit, Logstash, or similar tools is a plus.

Location and way of working
Base location: Noida/Gurgaon. The professional is required to work from the office.

Your role as an Assistant Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society.
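To make the masking responsibility concrete: in Cribl Stream itself, masking is configured as regex-based functions (Mask, Eval) expressed in the pipeline UI, not in Python. The sketch below only illustrates the kind of logic such a function applies to an event's raw message; the field name `_raw` follows common log-tooling convention, while the patterns and the hash-prefix scheme are illustrative assumptions, not Cribl built-ins.

```python
import hashlib
import re

# Illustrative patterns for sensitive substrings (not Cribl's shipped rules).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def mask_event(event: dict) -> dict:
    """Redact emails and card-like numbers in the raw message. Emails are
    replaced with a short stable hash so masked values can still be
    correlated across events; card numbers are fully suppressed."""
    raw = event.get("_raw", "")
    raw = EMAIL_RE.sub(
        lambda m: "email:" + hashlib.sha256(m.group().encode()).hexdigest()[:8],
        raw,
    )
    raw = CARD_RE.sub("****-MASKED", raw)
    return {**event, "_raw": raw}

event = {"_raw": "user alice@example.com paid with 4111 1111 1111 1111"}
masked = mask_event(event)
```

The hash-instead-of-blank choice matters for the "clean, enriched, and correctly formatted data" goal above: downstream SIEM queries can still group events by the masked token even though the original value is gone.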
In addition to living our purpose, Senior Executives across our organization must strive to be:
- Inspiring: leading with integrity to build inclusion and motivation
- Committed to creating purpose: creating a sense of vision and purpose
- Agile: achieving high-quality results through collaboration and team unity
- Skilled at building diverse capability: developing diverse capabilities for the future
- Persuasive/influencing: persuading and influencing stakeholders
- Collaborating: partnering to build new solutions
- Delivering value: showing commercial acumen
- Committed to expanding business: leveraging new business opportunities
- Analytical acumen: leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization
- Effective communication: able to hold well-structured and well-articulated conversations to achieve win-win outcomes
- Engagement management/delivery excellence: effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of engagements
- Managing change: responding to a changing environment with resilience
- Managing quality and risk: delivering high-quality results and mitigating risks with utmost integrity and precision
- Strategic thinking and problem solving: applying a strategic mindset to solve business issues and complex problems
- Tech savvy: leveraging ethical technology practices to deliver high impact for clients and for Deloitte
- Empathetic leadership and inclusivity: creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviors and attitudes to become more inclusive

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other.
Know more in our Global Impact Report and our India Impact Report.

Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome: entrust your happiness to us. Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

*Caution against fraudulent job offers*: We would like to advise career aspirants to exercise caution against fraudulent job offers or unscrupulous practices. At Deloitte, ethics and integrity are fundamental and not negotiable.
We do not charge any fee or seek any deposits, advance, or money from any career aspirant in relation to our recruitment process. We have not authorized any party or person to collect any money from career aspirants in any form whatsoever for promises of getting jobs in Deloitte or for being considered against roles in Deloitte. We follow a professional recruitment process, provide a fair opportunity to eligible applicants, and consider candidates only on merit. No one other than an authorized official of Deloitte is permitted to offer or confirm any job offer from Deloitte. We advise career aspirants to exercise caution.

Posted 1 month ago


10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Overview
At Motorola Solutions, we believe that everything starts with our people. We're a global, close-knit community, united by the relentless pursuit to help keep people safer everywhere. Our critical communications, video security, and command center technologies support public safety agencies and enterprises alike, enabling the coordination that's critical for safer communities, safer schools, safer hospitals, and safer businesses. Connect with a career that matters, and help us build a safer future.

Department Overview
Our IT organization has a critical role in driving extraordinary business results. Through a strong partnership with other areas of our business, we bring innovative thinking to every conversation and deliver with integrity. We're looking for people who bring great ideas and who make our partners' ideas better: intellectually curious advisors (not order takers) who focus on outcomes to creatively solve business problems. People who not only embrace change, but who accelerate it.

Job Description
Scope of Responsibilities/Expectations
- Install, configure, and administer databases, including, but not limited to, Oracle, MS SQL, and open-source technologies like MySQL, MongoDB, PostgreSQL, etc., hosted in AWS, OCI, Azure, and on-premises infrastructure.
- Provide technical leadership on database systems and solutions.
- Provide individual mentoring and training on databases and tools.
- Recommend, test, and evaluate new technologies, software tools, and required skill sets.
- Set up, maintain, and monitor databases in the development, test, and production environments.
- Provide support per project needs to investigate and troubleshoot issues; perform regular log analysis and security monitoring to identify possible intrusions.
- Perform ongoing performance tuning, technology upgrades, and system resource optimization.
- Develop tools and automation processes to improve infrastructure and database operations.
- Create concepts and drive database migration/upgrade projects.
- Develop prototypes and proof-of-concept solutions.
- Support developers in development-oriented activities.
- Prepare effort estimates, detailed execution plans, and delivery schedules.
- Liaise with the Infrastructure team, Application team, Architecture team, and third parties, as necessary.
- Develop and maintain documentation for assigned projects, including install guides, troubleshooting guides, SOPs, etc.

Specific Knowledge/Skills
Technical Skills:
- Experience in Oracle Database Administration (10g/11g/12c/18c/19c), MS SQL Server, and open-source database technologies like MySQL, MongoDB, PostgreSQL.
- Experience in database administration in cloud infrastructure (AWS/OCI/Azure).
- Hands-on database administration skills.
- Hands-on experience with Oracle Applications installation and maintenance.
- Experience installing and configuring Real Application Clusters / Maximum Availability Architecture (MAA) solutions or clustering technologies to address high-availability requirements.
- Experience performing backup and restore.
- Experience with advanced security options: data encryption and data masking, Database Vault and Audit Vault.
- Experience with disaster recovery solutions: logical/physical standby, Data Guard, or storage-level replication.
- Knowledge of Relational Database Management System (RDBMS) concepts such as SQL, stored procedures, JDBC/ODBC drivers, tables, foreign keys, joins, normalization, etc.
- UNIX knowledge with experience writing shell scripts and creating scheduled jobs on UNIX/Linux/Windows.
- Experience with standard operating procedures for pre- and post-production support activities.
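The data-masking skill mentioned above usually hinges on one property: referential integrity. A customer ID masked in one table must map to the same masked value everywhere it appears, or joins in the masked copy break. A minimal Python sketch of deterministic, keyed pseudonymization follows; the function name, key, and sample data are all hypothetical illustrations, not part of Oracle's or any vendor's masking tooling.

```python
import hashlib
import hmac

# Hypothetical masking key; in practice it would live in a secrets store,
# never in source control.
SECRET_KEY = b"rotate-me"

def mask_id(value: str) -> str:
    """Deterministically pseudonymize a value: the same input always yields
    the same token, so foreign-key relationships survive masking, while the
    keyed HMAC prevents reversing the token without the key."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return "ID-" + digest[:12]

# The same customer ID masks identically wherever it appears,
# so joins across masked tables still line up.
orders = [("C100", "order-1"), ("C100", "order-2"), ("C200", "order-3")]
masked = [(mask_id(cust), order) for cust, order in orders]
```

Using an HMAC rather than a plain hash is the usual design choice here: without the key, an attacker cannot precompute tokens for guessed IDs, yet the mapping stays stable across tables and masking runs.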
- Experience in troubleshooting and documenting issues, gaps, and resolutions.
- Able to provide expert-level incident-management support to the Operations team as needed.

Soft Skills:
- Manage project task execution independently and get all associated team members to deliver their tasks on time, both with and without direct authority.
- Self-motivated; works independently or as part of a team, learns quickly, meets deadlines, and demonstrates problem-solving skills.
- Highly organized and thorough, with an ability to facilitate communication and scheduling among different levels of staff.
- Ability to perform under pressure, handle interruptions and changes without losing productivity, and plan assignments and monitor performance according to priorities, as demonstrated by regularly meeting established deadlines.
- Ability to provide consistent follow-through with the Team Manager and IT Project Manager on issues/concerns to ensure appropriate visibility and escalation where needed.
- Excellent verbal and written communication skills.
- Excellent interpersonal skills with the ability to interact with others in a consistently positive manner, exercising discretion and tact.
- Excellent written communication skills with the ability to draft clear, concise specifications, documentation, and reports.

Basic Requirements
- Must have at least 10 years of professional work experience in IT.
- Must have at least 6 years of hands-on experience administering database technology.
- Must have at least 2 years of hands-on experience in cloud administration.
- An industry-recognized certification (such as OCP, OCM, AWS, or OCI Solution Architecture) is a plus.

We offer:
- Competitive salary package
- Strong team-oriented culture
- Flexible working hours
- Private medical coverage
- Life insurance
- Employee Stock Plan
- Access to wellness facilities and integration events
- Motorola Solutions supports CSR activities and encourages employees to participate

Travel Requirements: None
Relocation Provided: None
Position Type: Experienced
Referral Payment Plan: Yes

EEO Statement
Motorola Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion or belief, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other legally protected characteristic. We are proud of our people-first and community-focused culture, empowering every Motorolan to be their most authentic self and to do their best work to deliver on the promise of a safer world. If you'd like to join our team but feel that you don't quite meet all of the preferred skills, we'd still love to hear why you think you'd be a great addition to our team. We're committed to providing an inclusive and accessible recruiting experience for candidates with disabilities or other physical or mental health conditions. To request an accommodation, please email ohr@motorolasolutions.com.

Posted 1 month ago
