
563 Masking Jobs - Page 16

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

8.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description

About Sopra Steria: Sopra Steria, a major tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion.

Job Description

The world is how we shape it.

Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/ Bangalore
Education: B.E./ B.Tech./ MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good-to-have Skills: Snowpark, Data Build Tool, Finance Domain

Requirements:
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development is good to have.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities:
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging (see the sketch after this listing).
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications: B.Tech/MCA

Additional Information: At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
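For context on the RBAC and masking-policy duties listed above, here is a minimal sketch of how a Snowflake column masking policy might be created and attached using the Snowflake Python connector. The table, column, role, and credentials are hypothetical placeholders, not details from the posting.

```python
# Hypothetical sketch: applying a Snowflake column masking policy via the
# Python connector. Table, role, and credential names are illustrative only.
import snowflake.connector

MASKING_DDL = [
    # Reveal raw emails only to an (assumed) PII_ADMIN role; mask otherwise.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
           ELSE '***MASKED***' END
    """,
    # Attach the policy to a hypothetical customers.email column.
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
]

def apply_masking(conn) -> None:
    with conn.cursor() as cur:
        for stmt in MASKING_DDL:
            cur.execute(stmt)

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder credentials
        user="my_user",
        password="my_password",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        apply_masking(conn)
    finally:
        conn.close()
```

Because the policy evaluates CURRENT_ROLE() at query time, the same table serves both privileged and unprivileged users without duplicating data.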

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

The Opportunity: Full Stack Data Engineer

We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford.

What You'll Do (Responsibilities):
- Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines.
- End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight.
- GCP Data Solutions Leader: Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations.
- Data Governance & Security Champion: Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data.
- Data Workflow Orchestrator: Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC); a small orchestration sketch follows this listing.
- Performance Optimization Driver: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness.
- Collaborative Innovator: Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering.
- Automation & Reliability Advocate: Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency.
- Effective Communicator: Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment.
- Continuous Learner: Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities.
- Business Impact Translator: Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals.
- Documentation & Knowledge Sharer: Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability.

What You'll Bring (Qualifications):
- Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience).
- 5-7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).
- Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and Dataproc.
- Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.
- Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).
- Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.
- Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues.
- Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, Dataproc).
- A passion for data, innovation, and continuous learning.
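As a concrete illustration of the orchestration work this role describes, here is a minimal, hypothetical Airflow DAG that runs a BigQuery curation query on a schedule. The project, dataset, and table names are placeholders, and the `schedule` argument assumes Airflow 2.4+.

```python
# Hypothetical sketch of GCP pipeline orchestration: an Airflow DAG that
# runs a daily BigQuery transformation. All object names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

CURATE_SQL = """
CREATE OR REPLACE TABLE `my-project.curated.orders` AS
SELECT order_id, customer_id, SAFE_CAST(amount AS NUMERIC) AS amount
FROM `my-project.raw.orders`
WHERE amount IS NOT NULL
"""

with DAG(
    dag_id="curate_orders",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    curate = BigQueryInsertJobOperator(
        task_id="curate_orders",
        configuration={"query": {"query": CURATE_SQL, "useLegacySql": False}},
    )
```

In practice the same DAG would typically be deployed through Astronomer, with the underlying infrastructure provisioned via Terraform, as the posting notes.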

Posted 1 month ago

Apply

0 years

0 Lacs

Gurgaon

On-site

Requisition ID: 202408432
Locations: Gurugram, Haryana, India; Thane, Maharashtra, India

Description

The Role:
- Partner with other architecture resources to lead the end-to-end architecture of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability.
- Develop a broad understanding of the data lake architecture, including the impact of changes on the whole system, the onboarding of clients, and the security implications of the solution.
- Design new or improve upon existing architecture, including data ingestion, storage, transformation, and consumption layers.
- Define data models, schemas, and database structures optimized for H&B use cases, including claims, census, placement, broking, and finance sources.
- Design solutions for seamless integration of diverse health and benefits data sources.
- Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview (a masking sketch follows this listing).
- Evaluate data lake architecture to understand how technical decisions may impact business outcomes, and suggest new solutions/technologies that better align to the Health and Benefits Data strategy.
- Draw on internal and external practices to establish data lake architecture best practices and standards within the team and ensure that they are shared and understood.
- Continuously develop technical knowledge and be recognised as a key resource across the global team.
- Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform is delivering to the highest possible standards and that solutions support stakeholder needs and business requirements.
- Initiate practices that will increase code quality, performance, and security.
- Develop recommendations for continuous-improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the potential implications of changes.
- Build the team's technical expertise/capabilities/skills through the delivery of regular feedback, knowledge sharing, and coaching.
- High learning adaptability, demonstrating understanding of the implications of technical issues on business requirements and/or operations.
- Analyze existing data design and suggest improvements that promote performance, stability, and interoperability.
- Work with product management and business subject matter experts to translate business requirements into good data lake design.
- Maintain the governance model on the data lake architecture through training, design reviews, code reviews, and progress reviews.
- Participate in the development of data lake architecture and roadmaps in support of business strategies.
- Communicate with key stakeholders and development teams on technical solutions.
- Convince and present proposals by way of high-level solutions to end users and/or stakeholders.

The Requirement:
- Significant experience in a technology-related discipline, such as IT or Engineering; a Bachelor's/College Degree in these areas is beneficial.
- Strong experience in databases, tools, and methodologies.
- Strong skills across a broad range of database technologies, including Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and other Azure services. Working knowledge of Microsoft Fabric is preferred.
- Data analysis, data modeling, data integration, data warehousing, and database design.
- Experience with database performance evaluation and remediation.
- Ability to develop strategies for data acquisition, archive recovery, and implementation.
- Ability to design and develop databases, data warehouses, and multidimensional databases.
- Experience in data governance, including Microsoft Purview, Azure Data Catalog, Azure Data Share, and other Azure tools.
- Familiarity with legal risks related to data usage and rights.
- Experience in data security, including Azure Key Vault, Azure data encryption, data masking, data anonymization, and Azure Active Directory.
- Ability to develop database strategies for flexible high-performance reporting and business intelligence.
- Experience using data modeling tools and methodologies.
- Experience working within an Agile Scrum development life cycle, across varying levels of Agile maturity.
- Experience working with geographically distributed scrum teams.
- Excellent verbal and writing skills, including the ability to research, design, and write new documentation, as well as to maintain and improve existing material.

Technical Competencies:

Subject Matter Expertise
- Developing expertise: You strengthen your depth and/or breadth of subject matter knowledge and skills across multiple areas. You define the expertise required in your area based on emerging technologies and industry practices. You build the team's capability accordingly.
- Applying expertise: You apply subject matter knowledge and skills across multiple areas to assess the impact of complex issues and implement long-term solutions. You foster innovation using subject matter knowledge to enhance tools, practices, and processes for the team.

Solution Development
- Systems thinking: You lead and foster collaboration across H&B Data Platform Technology to develop solutions to complex issues. You apply a whole-systems approach to evaluating impact, and take ownership for ensuring links between structure, people, and processes are made.
- Focusing on quality: You instill a quality mindset in the team and ensure the appropriate methods, processes, and standards are in place for teams to deliver quality solutions. You create and deliver improvement initiatives.

Technical Communication
- Simplifying complexity: You develop tools, aids, and/or original content to support the delivery and/or understanding of complex information. You guide others on best practice.

Qualifications: Candidate must have significant experience in a technology-related discipline, such as IT or Engineering; a Bachelor's/College Degree in these areas is beneficial.
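To make the data-masking requirement above concrete, here is a minimal sketch of applying dynamic data masking in Azure SQL Database via T-SQL, driven from Python with pyodbc. The server, database, table, column, and credential values are hypothetical placeholders.

```python
# Hypothetical sketch: enabling Azure SQL dynamic data masking, one of the
# data-security techniques the posting lists. All names are placeholders.
import pyodbc

MASKING_STATEMENTS = [
    # Built-in email() function masks to the form aXXX@XXXX.com.
    "ALTER TABLE dbo.Members ALTER COLUMN Email "
    "ADD MASKED WITH (FUNCTION = 'email()')",
    # partial() keeps 1 leading char, pads the middle, keeps 2 trailing chars.
    "ALTER TABLE dbo.Members ALTER COLUMN Phone "
    "ADD MASKED WITH (FUNCTION = 'partial(1, \"XXXXXX\", 2)')",
]

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=hb_platform;"
    "UID=admin_user;PWD=placeholder"  # placeholder credentials
)
cursor = conn.cursor()
for stmt in MASKING_STATEMENTS:
    cursor.execute(stmt)
conn.commit()
conn.close()
```

Dynamic data masking obfuscates values at query time for non-privileged logins, so development and reporting users see masked output while the underlying data is unchanged.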

Posted 1 month ago

Apply

8.0 years

1 Lacs

Gurgaon

On-site

To be responsible for the overall direction, coordination, implementation, execution, control, and completion of projects, ensuring quality and on-time delivery. The role will be based out of Gurgaon, with frequent travel to site locations (for the entire duration of the assigned project).

Roles & Responsibilities:
- Drawings and BoQ understanding: Ability to read layout drawings, detailed sectional drawings, and elevation drawings. Understanding and analyzing the BoQ. Proficient with on-site measurement and demarcating changes from the layout/BoQ. Knowledge of various materials used in interior fit-out projects.
- Client management & servicing: Strong client communication on-site. Ensuring the client is up to date with project snags and potential roadblocks during on-site visits.
- Project management: Creation and follow-through of the purchase plan. Material tracking and stacking, ensuring <5% material wastage. Timeline management and tracking for updates to stakeholders. Management of a team of contractors, responsible for onboarding new high-quality contractors. Labour management, ensuring discipline, attendance, and allocation of work. Regular quality checks on-site, ensuring adherence to drawings, BoQ, and client instructions. Quality monitoring to ensure a no-snags policy and ensuring use of masking tape, floor covering, etc. Must ensure timely arrival of materials at site and regular site cleaning.
- Leadership: Manages team allocation on projects and on-rolling of the execution team, including third-party contractors; must ensure timely arrival of the team on site every day. In charge of ensuring that a project overview meeting is held with the project manager and electrical head to understand the layout, BoQ, electricals, cleaning plan, and purchase plan. Must attend a 15-minute weekly stand-up to discuss site progress, plan for the next week, and clear roadblocks for the site supervisor. Must be up to date with all site activity, informing project managers of updates and issues, and ensuring the communication of roadblocks to the projects team. Required to conduct training for the site team and provide feedback to the team after every project.

Required Skills:
- Understanding of layouts, BoQ, and electricals.
- Strong verbal communication skills in English and/or Hindi.
- Experience managing an on-site labour team.
- Working knowledge of computers (MS Office).

Qualification:
- Graduate with working knowledge of computers, OR Diploma/B.Tech/B.E., preferably in Civil or a related field.
- Ability to work independently and experience managing a cross-functional team.
- 8+ years of experience as a site manager for a commercial/corporate interior fit-out company.

Job Types: Full-time, Permanent
Pay: Up to ₹100,000.00 per month
Benefits: Cell phone reimbursement, health insurance, leave encashment, paid sick time, Provident Fund
Schedule: Day shift, fixed shift
Work Location: In person

Posted 1 month ago

Apply

1.0 years

0 - 0 Lacs

Mohali

On-site

We are seeking a creative and detail-oriented Video Editor & Motion Graphics Designer with expertise in Adobe After Effects, Premiere Pro, Photoshop, and Illustrator. The ideal candidate will be responsible for editing compelling videos and creating high-quality animations, promotional content, reels, explainer videos, and brand assets for digital and social platforms.

Key Responsibilities:
- Edit videos using Adobe Premiere Pro, including syncing audio, color correction, transitions, and effects.
- Create engaging motion graphics and animations in After Effects (e.g., logo animations, text animation, character motion).
- Design visual assets such as thumbnails, social media posts, and posters using Photoshop & Illustrator.
- Collaborate with the creative and marketing teams to conceptualize and execute video projects.
- Manage multiple projects and meet tight deadlines without compromising quality.
- Stay updated with current video trends and social media formats (Reels, Shorts, YouTube intros, etc.).

Required Skills:
- Proficient in Adobe After Effects, Premiere Pro, Photoshop, and Illustrator.
- Strong understanding of storytelling, pacing, sound design, and visual effects.
- Knowledge of color grading, motion tracking, masking, and keyframing.
- Basic knowledge of typography and layout design for motion graphics.
- Bonus: Experience with audio editing, 3D animation (Blender/Cinema 4D), or video scripting.

Preferred Qualifications:
- Bachelor's degree or diploma in Film, Design, Animation, or a related field.
- 1+ year of professional experience or a strong portfolio.
- Understanding of social media platforms and formats (Instagram, YouTube, etc.).

Benefits:
- Competitive salary / project-based compensation
- Creative work environment
- Opportunities to work on music videos, advertisements, films, and social content
- Growth and learning opportunities in a digital-first company

Job Types: Full-time, Internship
Contract length: 15000 months
Pay: ₹10,385.41 - ₹15,000.59 per month
Schedule: Day shift
Supplemental Pay: Performance bonus, yearly bonus
Work Location: In person
Expected Start Date: 01/07/2025

Posted 1 month ago

Apply

4.0 - 5.0 years

5 - 18 Lacs

Noida

On-site

We are seeking a skilled Data Masking Engineer with 4-5 years of experience in SQL Server and Redgate tools to design, implement, and manage data masking solutions. The ideal candidate will ensure sensitive data is protected while maintaining database usability for development, testing, and analytics. The candidate should be ready to relocate to Johannesburg, South Africa, at the earliest.

Responsibilities:
- Design and implement data masking strategies for SQL Server databases to comply with security and privacy regulations (GDPR, HIPAA, etc.).
- Use Redgate Data Masker and other tools to anonymize sensitive data while preserving referential integrity (a sketch of this idea follows the listing).
- Develop and maintain masking rules, scripts, and automation workflows for efficient data obfuscation.
- Collaborate with DBAs, developers, and security teams to identify sensitive data fields and define masking policies.
- Validate masked data to ensure consistency, usability, and compliance with business requirements.
- Troubleshoot and optimize masking processes to minimize performance impact on production and non-production environments.
- Document masking procedures, policies, and best practices for internal teams.
- Stay updated with Redgate tool updates, SQL Server features, and data security trends.

Qualifications:
- 4-5 years of hands-on experience in SQL Server database development/administration.
- Strong expertise in Redgate Data Masker or similar data masking tools (e.g., Delphix, Informatica).
- Proficiency in T-SQL, PowerShell, or Python for scripting and automation.
- Knowledge of data privacy laws (GDPR, CCPA) and secure data handling practices.
- Experience with SQL Server security features (Dynamic Data Masking, Always Encrypted, etc.) is a plus.
- Familiarity with DevOps/CI-CD pipelines for automated masking in development/test environments.
- Strong analytical skills to ensure masked data remains realistic for testing.

Preferred Qualifications:
- Redgate or Microsoft SQL Server certifications.
- Experience with SQL Server Integration Services (SSIS) or ETL processes.
- Knowledge of cloud databases (Azure SQL, AWS RDS) and their masking solutions.

Job Type: Full-time
Pay: ₹586,118.08 - ₹1,894,567.99 per year
Schedule: Day shift
Work Location: In person
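The "preserve referential integrity" requirement above is worth illustrating. Here is a minimal, hypothetical Python sketch of deterministic masking: a keyed hash maps each real customer_id to the same pseudonym everywhere it appears, so joins across tables still work after masking. This is a generic technique, not Redgate Data Masker's own mechanism; table and column names are placeholders.

```python
# Hypothetical sketch of deterministic, join-preserving masking.
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # placeholder masking key; store securely

def pseudonymize(value: str) -> str:
    """Deterministically map a sensitive value to a stable pseudonym."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"CUST_{digest[:12]}"

# The same input always yields the same output, so a customer_id masked in
# the customers table matches the masked foreign key in the orders table.
customers = [{"customer_id": "C1001", "email": "a@example.com"}]
orders = [{"order_id": "O1", "customer_id": "C1001"}]

for row in customers:
    row["customer_id"] = pseudonymize(row["customer_id"])
    row["email"] = "masked@example.invalid"
for row in orders:
    row["customer_id"] = pseudonymize(row["customer_id"])

assert orders[0]["customer_id"] == customers[0]["customer_id"]
print(customers, orders)
```

Keying the hash (HMAC) rather than hashing alone matters: without the secret key, an attacker who knows the ID format could rebuild the mapping by brute force.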

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hello, greetings from ZettaMine!

📌 Job Title: Storage Administrator – L1 / L2 / L3
Location: Hyderabad / Navi Mumbai
Employment Type: Full-Time – Direct Payroll
Joining: Immediate Joiners Preferred

🔹 L1 Storage Administrator – JD
Experience: 0–2 Years
Role Type: Entry-Level / Monitoring & Basic Support
Key Responsibilities:
- Monitor SAN/NAS environments and raise alerts as needed.
- Perform basic troubleshooting and escalate unresolved issues to L2/L3 teams.
- Assist with daily health checks of storage devices and backup systems.
- Log incidents and service requests in ticketing systems (e.g., ServiceNow).
- Provide support in executing standard operational procedures.
Skills:
- Basic understanding of storage systems (NetApp, Dell EMC, HPE).
- Familiarity with ITSM tools and ticketing workflows.
- Good communication and willingness to learn.
- Flexible with 24/7 rotational shifts.

🔹 L2 Storage Administrator – JD
Experience: 2–5 Years
Role Type: Mid-Level / Hands-on Operations & Troubleshooting
Key Responsibilities:
- Perform storage provisioning, zoning, masking, and replication tasks.
- Manage backup and restore operations using tools like Veritas, Commvault, or NetBackup.
- Conduct performance tuning and space management.
- Troubleshoot moderately complex storage-related issues.
- Work with the L1 team to guide and resolve tickets, escalating to L3 when needed.
- Create and maintain storage documentation.
Skills:
- Hands-on experience with SAN/NAS storage (NetApp, EMC, IBM, HPE).
- Good understanding of Fibre Channel, iSCSI, RAID, and LUN management.
- Experience with backup software and snapshot technologies.
- Basic scripting (PowerShell, Shell, etc.) is a plus (a small example follows this listing).
- Strong understanding of ITIL practices.

🔹 L3 Storage Administrator – JD
Experience: 5+ Years
Role Type: Expert / Design, Escalation & Strategy
Key Responsibilities:
- Design and implement enterprise-grade storage solutions.
- Perform root cause analysis on complex incidents and provide permanent fixes.
- Lead capacity planning and performance optimization efforts.
- Work with cross-functional teams to support business continuity (DR/BCP).
- Oversee firmware updates, storage migrations, and automation initiatives.
- Mentor L1/L2 teams and develop knowledge base articles.
Skills:
- Deep expertise in multiple storage platforms (e.g., Dell EMC VMAX/Unity, NetApp ONTAP, IBM, Hitachi).
- Experience with automation (Python, PowerShell) and orchestration tools.
- Knowledge of storage integration with virtualization platforms (VMware, Hyper-V).
- Strong project management and documentation skills.
- Excellent troubleshooting and client interaction skills.

Interested candidates can reach out at md.afreen@zettamine.com

Thanks & Regards,
Afreen
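As a flavor of the "basic scripting" called out in the L2 skills, here is a small, hypothetical Python sketch of a capacity health check: flag volumes crossing a usage threshold, the kind of condition an L1 admin would then raise a ticket for. The volume data is stubbed in; in practice it would come from an array's API or a CLI export.

```python
# Hypothetical capacity-threshold check for storage volumes.
THRESHOLD_PCT = 85.0  # assumed alerting threshold

volumes = [  # stubbed sample data standing in for an array API response
    {"name": "vol_finance", "used_gb": 920, "capacity_gb": 1000},
    {"name": "vol_archive", "used_gb": 400, "capacity_gb": 1000},
]

def over_threshold(vol: dict) -> bool:
    """True when a volume's used percentage meets or exceeds the threshold."""
    return 100.0 * vol["used_gb"] / vol["capacity_gb"] >= THRESHOLD_PCT

for vol in volumes:
    if over_threshold(vol):
        # An L1 admin would log an incident here (e.g., via ServiceNow).
        print(f"ALERT: {vol['name']} above {THRESHOLD_PCT}% capacity")
```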

Posted 1 month ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Details: Job Description

Minimum Qualification: Graduate (B.Tech)
Relevant Experience: 1+ years

As a System Administrator, you will provide Level 1 technical support. You will be the first point of contact for all infrastructure monitoring incidents and requests in a fast-paced professional environment. You will monitor, create, and triage incidents as per the ITIL framework.

- Good-to-have troubleshooting skills on Windows Server 2012/2019.
- Create documentation and SOPs to ease tasks and resolve infrastructure-related issues.
- Work in a timely and efficient manner while ensuring attendance, quality, and customer service metrics are met.
- Excellent written and verbal communication.
- Good exposure to ticketing tools (ServiceNow, JIRA, Cherwell, etc.); a sample ticketing integration sketch follows this listing.
- Good troubleshooting skills on Windows/Linux OS.
- Basic understanding of networking and network devices.
- Excellent understanding of infrastructure/application/job monitoring.
- Good exposure to monitoring tools (Zabbix/Dynatrace/SCOM/SolarWinds/Datadog/Nagios, etc.); administration of monitoring tools is a good-to-have skill.

Responsibilities:
- Provide L1+ support for team deliverables.
- Configuration, allocation, and management of centralized storage systems and other deployed storage solutions.
- Help develop or engineer new solutions related to data storage and access.
- Help set up new storage technologies: tapeless, cloud storage, etc.
- Take care of most P2/P3/P4 tasks and document the RCCA.
- Interact with the client over calls and provide details and reasoning for the deliverables.
- Troubleshooting experience and ability to fulfill L2 support.
- Good knowledge of multiple backup and storage systems: Commvault, Rubrik, Veeam, etc.; EMC, NetApp, Oracle storage appliances, IBM SAN, etc.
- Experience with monitoring and performance tuning of various storage systems.
- Experience with capacity planning and storage allocation forecasting.
- Experience in storage operations management, backup systems, SAN, NAS, and DR.
- Working knowledge of systems from major data storage vendors and cloud services.
- Working knowledge of technical aspects of storage systems, including but not limited to DAS, NAS, Fibre Channel, RAID configuration, and LUN masking.
- Basic scripting.
- Good knowledge of Windows and Unix OS.
- Working knowledge of VMware/Hyper-V.
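Since the posting centers on ITIL incident logging with tools like ServiceNow, here is a minimal sketch of creating an incident through the ServiceNow Table API from Python. The instance URL and credentials are placeholders; real deployments would use OAuth or an integration user rather than inline basic auth.

```python
# Hypothetical sketch: raising an incident via the ServiceNow Table API,
# the kind of monitoring-to-ticketing handoff this L1 role performs.
import requests

INSTANCE = "https://example.service-now.com"  # placeholder instance
AUTH = ("api_user", "api_password")           # placeholder credentials

def create_incident(short_description: str, urgency: int = 3) -> str:
    """Create an incident record and return its number (e.g. INC0012345)."""
    resp = requests.post(
        f"{INSTANCE}/api/now/table/incident",
        auth=AUTH,
        headers={"Content-Type": "application/json"},
        json={"short_description": short_description, "urgency": str(urgency)},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["number"]

if __name__ == "__main__":
    print(create_incident("Zabbix alert: disk latency high on storage node"))
```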

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: L1 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 1–2 Years
Employment Type: Full-time

Job Summary: We are looking for an enthusiastic and detail-oriented L1 Storage Administrator to provide first-level support for storage infrastructure. The ideal candidate will assist in monitoring, basic troubleshooting, and operational tasks for enterprise storage systems.

Key Responsibilities:
- Monitor storage devices and alerts through tools and dashboards.
- Perform basic health checks and log reviews for SAN/NAS environments.
- Coordinate with L2/L3 teams for incident escalation.
- Maintain asset inventory, storage usage reports, and documentation.
- Support routine tasks like tape management, backup monitoring, and scheduled maintenance.

Requirements:
- Basic understanding of storage systems (NetApp, EMC, HPE, etc.).
- Familiarity with backup software (Commvault, Veritas, etc.).
- Knowledge of ITIL processes (incident, change, and problem management).
- Willingness to work in shifts and on-call rotations.

Job Title: L2 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 3–5 Years
Employment Type: Full-time

Job Summary: We are seeking a skilled L2 Storage Administrator responsible for the day-to-day operations, configuration, and troubleshooting of enterprise storage systems.

Key Responsibilities:
- Perform provisioning and de-provisioning of SAN/NAS storage.
- Troubleshoot medium-complexity storage and backup issues.
- Execute and validate backup/restore tasks and replication jobs.
- Work closely with server, network, and application teams to resolve performance issues.
- Implement data migrations, upgrades, and firmware updates under L3 guidance.

Requirements:
- Experience with enterprise storage platforms (NetApp, Dell EMC, HPE 3PAR/Primera).
- Hands-on experience with backup and recovery solutions (e.g., Commvault, Veeam).
- Basic scripting knowledge (PowerShell, Bash) is a plus.
- Strong knowledge of zoning, LUN masking, and volume management.
- Ability to create detailed documentation and reports.

Job Title: L3 Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 6–10+ Years
Employment Type: Full-time

Job Summary: We are hiring a highly experienced L3 Storage Administrator to design, manage, and troubleshoot complex storage infrastructures. You will serve as the highest level of technical escalation and work on capacity planning, architecture, and performance optimization.

Key Responsibilities:
- Lead the architecture and implementation of storage solutions.
- Perform root cause analysis and provide permanent resolutions for critical incidents.
- Design and implement DR/BCP strategies and data lifecycle management.
- Collaborate with vendors on escalations, patches, and feature enhancements.
- Evaluate and recommend new storage technologies and tools.
- Mentor L1/L2 admins and contribute to SOP development.

Requirements:
- Deep expertise in enterprise storage (EMC VMAX, NetApp AFF, HPE 3PAR, Pure Storage).
- Strong knowledge of SAN fabric (Brocade/Cisco), multipathing, and zoning.
- Proficiency in backup/recovery and disaster recovery technologies.
- Scripting and automation (PowerShell, Python) experience preferred.
- IT certifications (e.g., NetApp NCDA, EMC Proven, SNIA, Veeam) highly preferred.

How to Apply:
📧 Send your updated resume to latha.a@zettamine.com, including your full name, contact number, total experience, relevant experience, current CTC, expected CTC, and notice period.

#storageadmin #L1 #L2 #L3 #storage

Posted 1 month ago

Apply

9.0 years

5 - 10 Lacs

Thiruvananthapuram

On-site

9 - 12 Years | 1 Opening | Trivandrum

Role description

Role Proficiency: Leverage expertise in a technology area (e.g., Informatica transformations, Teradata data warehouse, Hadoop, analytics). Responsible for architecture for small/mid-size projects.

Outcomes:
- Implement data extract and transformation in a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools in any one of the cloud providers (AWS/Azure/GCP).
- Understand business workflows and related data flows. Develop designs for data acquisition and data transformation or data modelling; apply business intelligence on data, or design data fetching and dashboards.
- Design information structure and work-/dataflow navigation. Define backup, recovery, and security specifications.
- Enforce and maintain naming standards and the data dictionary for data models.
- Provide or guide the team to perform estimates.
- Help the team develop proofs of concept (POCs) and solutions relevant to customer problems. Able to troubleshoot problems while developing POCs.
- Architect/Big Data specialty certification in AWS/Azure/GCP (general, e.g., via Coursera or a similar learning platform, or any ML).

Measures of Outcomes:
- Percentage of billable time spent in a year developing and implementing data transformation or data storage.
- Number of best practices documented for new tools and technologies emerging in the market.
- Number of associates trained on the data service practice.

Outputs Expected:

Strategy & Planning:
- Create or contribute to short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Implement methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Ensure that data strategies and architectures meet regulatory compliance requirements.
- Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities, or attend conferences with respect to data in the cloud.

Operational Management:
- Help architects establish governance, stewardship, and frameworks for managing data across the organization.
- Provide support in implementing the appropriate tools, software, applications, and systems to support data technology goals.
- Collaborate with project managers and business teams for all projects involving enterprise data.
- Analyse data-related issues with systems integration, compatibility, and multi-platform integration.

Project Control and Review:
- Provide advice to teams facing complex technical issues in the course of project delivery.
- Define and measure project- and program-specific architectural and technology quality metrics.

Knowledge Management & Capability Development:
- Publish and maintain a repository of solutions, best practices, standards, and other knowledge articles for data management.
- Conduct and facilitate knowledge sharing and learning sessions across the team.
- Gain industry-standard certifications in technology or area of expertise.
- Support technical skill building (including hiring and training) for the team based on inputs from the project manager/RTEs.
- Mentor new members of the team in technical areas.
- Gain and cultivate domain expertise to provide the best and optimized solutions to customers (delivery).

Requirement Gathering and Analysis:
- Work with customer business owners and other teams to collect, analyze, and understand the requirements, including NFRs/define NFRs.
- Analyze gaps/trade-offs based on the current system context and industry practices; clarify the requirements by working with the customer.
- Define the systems and sub-systems that define the programs.

People Management:
- Set goals and manage the performance of team engineers.
- Provide career guidance to technical specialists and mentor them.

Alliance Management:
- Identify alliance partners based on an understanding of service offerings and client requirements.
- In collaboration with the architect, create a compelling business case around the offerings.
- Conduct beta testing of the offerings and relevance to the program.

Technology Consulting:
- In collaboration with Architects II and III, analyze the application and technology landscape, processes, and tools to arrive at the architecture options that best fit the client program.
- Analyze cost vs. benefits of solution options.
- Support Architects II and III to create a technology/architecture roadmap for the client.
- Define the architecture strategy for the program.

Innovation and Thought Leadership:
- Participate in internal and external forums (seminars, paper presentations, etc.).
- Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency.
- Identify business opportunities to create reusable components/accelerators, and reuse existing components and best practices.

Project Management Support:
- Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies.

Stakeholder Management:
- Monitor the concerns of internal stakeholders like Product Managers and RTEs and external stakeholders like client architects on architecture aspects. Follow through on commitments to achieve timely resolution of issues.
- Conduct initiatives to meet client expectations.
- Work to expand the professional network in the client organization at team and program levels.

New Service Design:
- Identify potential opportunities for new service offerings based on customer voice/partner inputs.
- Conduct beta testing/POCs as applicable.
- Develop collaterals and guides for GTM.

Skill Examples:
- Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of architects.
- Use technology knowledge to create proofs of concept (POCs) and (reusable) assets under the guidance of the specialist. Apply best practices in own area of work, helping with performance troubleshooting and other complex troubleshooting.
- Define, decide, and defend the technology choices made; review solutions under guidance.
- Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
- Use independent knowledge of design patterns, tools, and principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate options for the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by specialists for efficiency (consumption of hardware and memory, memory leaks, etc.).
- Use knowledge of software development processes, tools, and techniques to identify and assess incremental improvements for the software development process, methodology, and tools. Take technical responsibility for all stages in the software development process. Conduct optimal coding with a clear understanding of memory leakage and related impact.
- Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas.
- Use knowledge of project management and Agile tools and techniques to support, plan, and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
- Use knowledge of project metrics to understand their relevance to the project. Collect and collate project metrics and share them with the relevant stakeholders.
- Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
- Strong proficiency in understanding data workflows and dataflow.
- Attention to detail.
- High analytical capabilities.

Knowledge Examples:
- Data visualization
- Data migration
- RDBMSs (relational database management systems), SQL
- Hadoop technologies like MapReduce, Hive, and Pig
- Programming languages, especially Python and Java
- Operating systems like UNIX and MS Windows
- Backup/archival software

Additional Comments:

Snowflake Architect

Key Responsibilities:
- Solution Design: Designing the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms).
- Data Modeling: Designing efficient and scalable physical data models within Snowflake. Defining table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance (a clustering and cost-control sketch follows this listing).
- Security Architecture: Designing the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies.
- Performance and Scalability Strategy: Designing solutions with performance and scalability in mind. Defining warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensuring the architecture can handle future growth in data volume and user concurrency.
- Cost Optimization Strategy: Designing architectures that are inherently cost-effective. Making strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks).
- Technology Evaluation and Selection: Evaluating and recommending specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements.
- Standards and Governance: Defining best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization.
- Roadmap and Strategy: Aligning the Snowflake data architecture with overall business intelligence and data strategy goals. Planning for future enhancements and platform evolution.
- Technical Leadership: Providing guidance and mentorship to developers, data engineers, and administrators working with Snowflake.

Key Skills:
- Deep understanding of Snowflake's advanced features and architecture.
- Strong data warehousing concepts and data modeling expertise.
- Solution architecture and system design skills.
- Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates.
- Expertise in performance tuning principles and techniques at an architectural level.
- Strong understanding of data security principles and implementation patterns.
- Knowledge of various data integration patterns (ETL, ELT, streaming).
- Excellent communication and presentation skills to articulate designs to technical and non-technical audiences.
- Strategic thinking and planning abilities.

Looking for 12+ years of experience to join our team.

Skills: Snowflake, data modeling, cloud platforms, solution architecture

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
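The clustering and cost-optimization duties above can be grounded with a small, hypothetical sketch: defining a clustering key on a large fact table and capping warehouse spend with a resource monitor. Table, warehouse, and quota values are illustrative, not a prescribed design; a live Snowflake connection is assumed to be passed in.

```python
# Hypothetical sketch of architect-level Snowflake DDL: a clustering key
# plus a resource monitor. All names and limits are placeholders.
import snowflake.connector  # connection creation omitted for brevity

ARCHITECTURE_DDL = [
    # Cluster a large fact table on the columns most queries filter by.
    """
    ALTER TABLE analytics.public.fact_claims
      CLUSTER BY (claim_date, payer_id)
    """,
    # Cap monthly credit spend on the transformation warehouse.
    """
    CREATE OR REPLACE RESOURCE MONITOR transform_rm
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
    """,
    "ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = transform_rm",
]

def apply_ddl(conn: "snowflake.connector.SnowflakeConnection") -> None:
    with conn.cursor() as cur:
        for stmt in ARCHITECTURE_DDL:
            cur.execute(stmt)
```

Clustering keys pay off only on large, frequently filtered tables, since reclustering itself consumes credits; that trade-off is exactly the kind of decision this role owns.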

Posted 1 month ago

Apply

4.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction

In this role, you will work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities

A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions.

What you'll do as a Data Engineer – Data Platform Services:

Data Ingestion & Processing:
- Design and develop data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
- Implement streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark); a streaming sketch follows this listing.
- Work with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management:
- Implement Apache Iceberg tables for efficient data storage and retrieval.
- Manage distributed data processing with Cloudera Data Platform (CDP).
- Ensure data lineage, cataloging, and governance for compliance with bank/regulatory policies.

Optimization & Performance Tuning:
- Optimize Spark and PySpark jobs for performance and scalability.
- Implement data partitioning, indexing, and caching to enhance query performance.
- Monitor and troubleshoot pipeline failures and performance bottlenecks.

Security & Compliance:
- Ensure secure data access, encryption, and masking using Thales CipherTrust.
- Implement role-based access controls (RBAC) and data governance policies.
- Support metadata management and data quality initiatives.

Collaboration & Automation:
- Work closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions.
- Automate data workflows using Airflow and implement CI/CD pipelines with GitLab and Sonatype Nexus.
- Support Denodo-based data virtualization for seamless data access.

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- 4-7 years of experience in big data engineering, data integration, and distributed computing.
- Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP).
- Proficiency in Python or Scala for data processing.
- Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
- Understanding of data security, encryption, and compliance frameworks.

Preferred Technical and Professional Experience:
- Experience in banking or financial services data platforms.
- Exposure to Denodo for data virtualization and DGraph for graph-based insights.
- Familiarity with cloud data platforms (AWS, Azure, GCP).
- Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
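The Kafka-to-Iceberg ingestion pattern named above can be sketched in a few lines of PySpark Structured Streaming. This is a hypothetical illustration: the broker, topic, schema, and catalog/table names are placeholders, and the cluster is assumed to have the Kafka and Iceberg connectors on its classpath.

```python
# Hypothetical sketch: streaming Kafka events into an Apache Iceberg table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka_to_iceberg").getOrCreate()

# Assumed JSON payload shape; a real pipeline would derive this from a
# schema registry.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "txn_events")                 # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("iceberg")
    .option("checkpointLocation", "/tmp/chk/txn_events")
    .toTable("lake.raw.txn_events")  # assumed Iceberg catalog/table
)
query.awaitTermination()
```

The checkpoint location gives the stream exactly-once table commits across restarts, which matters for the lineage and compliance duties the posting emphasizes.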

Posted 1 month ago

Apply

4.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are seeking a skilled Data Masking Engineer with 4-5 years of experience in SQL Server and Redgate tools to design, implement, and manage data masking solutions. The ideal candidate will ensure sensitive data is protected while maintaining database usability for development, testing, and analytics. The candidate should be ready to relocate to Johannesburg, South Africa, at the earliest.

Responsibilities:
- Design and implement data masking strategies for SQL Server databases to comply with security and privacy regulations (GDPR, HIPAA, etc.).
- Use Redgate Data Masker and other tools to anonymize sensitive data while preserving referential integrity.
- Develop and maintain masking rules, scripts, and automation workflows for efficient data obfuscation.
- Collaborate with DBAs, developers, and security teams to identify sensitive data fields and define masking policies.
- Validate masked data to ensure consistency, usability, and compliance with business requirements (a validation sketch follows this listing).
- Troubleshoot and optimize masking processes to minimize performance impact on production and non-production environments.
- Document masking procedures, policies, and best practices for internal teams.
- Stay updated with Redgate tool updates, SQL Server features, and data security trends.

Qualifications:
- 4-5 years of hands-on experience in SQL Server database development/administration.
- Strong expertise in Redgate Data Masker or similar data masking tools (e.g., Delphix, Informatica).
- Proficiency in T-SQL, PowerShell, or Python for scripting and automation.
- Knowledge of data privacy laws (GDPR, CCPA) and secure data handling practices.
- Experience with SQL Server security features (Dynamic Data Masking, Always Encrypted, etc.) is a plus.
- Familiarity with DevOps/CI-CD pipelines for automated masking in development/test environments.
- Strong analytical skills to ensure masked data remains realistic for testing.

Preferred Qualifications:
- Redgate or Microsoft SQL Server certifications.
- Experience with SQL Server Integration Services (SSIS) or ETL processes.
- Knowledge of cloud databases (Azure SQL, AWS RDS) and their masking solutions.
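The "validate masked data" responsibility above lends itself to automation. Here is a minimal, hypothetical Python sketch of a post-masking check: confirm row counts survived the masking run and that no raw-looking email addresses leaked through. The column name and the sentinel masked domain are assumptions for illustration.

```python
# Hypothetical post-masking validation: row-count parity plus a leak scan.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def validate(masked_rows: list[dict], source_row_count: int) -> list[str]:
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    if len(masked_rows) != source_row_count:
        failures.append(
            f"row count changed: {source_row_count} -> {len(masked_rows)}"
        )
    for i, row in enumerate(masked_rows):
        value = row.get("email", "")
        # Masked output should not still look like a deliverable address.
        if EMAIL_RE.fullmatch(value) and not value.endswith("@example.invalid"):
            failures.append(f"row {i}: email not masked: {value!r}")
    return failures

if __name__ == "__main__":
    rows = [{"email": "masked@example.invalid"}, {"email": "leak@corp.com"}]
    print(validate(rows, source_row_count=2))  # flags the second row
```

A check like this would typically run in the CI/CD masking pipeline the posting mentions, gating refreshes of development and test environments.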

Posted 1 month ago

Apply

8.0 - 11.0 years

6 - 9 Lacs

Noida

On-site

Snowflake - Senior Technical Lead
Full-time

Company Description

About Sopra Steria: Sopra Steria, a major tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion.

The world is how we shape it.

Job Description

Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/ Bangalore
Education: B.E./ B.Tech./ MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good-to-have Skills: Snowpark, Data Build Tool, Finance Domain

Preferred Skills:
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities:
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe); a Streams & Tasks sketch follows this listing.
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications: B.Tech/MCA

Additional Information: At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
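The Streams & Tasks responsibility above can be illustrated with a small, hypothetical sketch: a stream captures change rows on a staging table, and a scheduled task merges them into a dimension table. All object names, the warehouse, and the schedule are placeholders.

```python
# Hypothetical sketch of the Snowflake Streams & Tasks CDC pattern.
import snowflake.connector  # connection creation omitted for brevity

STREAMS_TASKS_DDL = [
    # Stream records inserts/updates/deletes on the staging table.
    "CREATE OR REPLACE STREAM stg_orders_stream ON TABLE stg_orders",
    # Task drains the stream into the dimension table every 5 minutes.
    """
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE = '5 MINUTE'
    AS
      MERGE INTO dim_orders d
      USING stg_orders_stream s ON d.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET d.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
        VALUES (s.order_id, s.status)
    """,
    "ALTER TASK merge_orders_task RESUME",  # tasks are created suspended
]

def deploy(conn: "snowflake.connector.SnowflakeConnection") -> None:
    with conn.cursor() as cur:
        for stmt in STREAMS_TASKS_DDL:
            cur.execute(stmt)
```

Reading a stream inside a committed DML statement advances its offset, so each change row is consumed exactly once by the merge.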

Posted 1 month ago

Apply

12.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Role Proficiency: Leverage expertise in a technology area (e.g. Infromatica Transformation Terradata data warehouse Hadoop Analytics) Responsible for Architecture for a small/mid-size projects. Outcomes Implement either data extract and transformation a data warehouse (ETL Data Extracts Data Load Logic Mapping Work Flows stored procedures data warehouse) data analysis solution data reporting solutions or cloud data tools in any one of the cloud providers(AWS/AZURE/GCP) Understand business workflows and related data flows. Develop design for data acquisitions and data transformation or data modelling; applying business intelligence on data or design data fetching and dashboards Design information structure work-and dataflow navigation. Define backup recovery and security specifications Enforce and maintain naming standards and data dictionary for data models Provide or guide team to perform estimates Help team to develop proof of concepts (POC) and solution relevant to customer problems. Able to trouble shoot problems while developing POCs Architect/Big Data Speciality Certification in (AWS/AZURE/GCP/General for example Coursera or similar learning platform/Any ML) Measures Of Outcomes Percentage of billable time spent in a year for developing and implementing data transformation or data storage Number of best practices documented in any new tool and technology emerging in the market Number of associates trained on the data service practice Outputs Expected Strategy & Planning: Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap Implement methods and procedures for tracking data quality completeness redundancy and improvement Ensure that data strategies and architectures meet regulatory compliance requirements Begin engaging external stakeholders including standards organizations regulatory bodies operators and scientific research communities or attend conferences with respect to data in cloud Operational Management Help Architects to establish governance stewardship and frameworks for managing data across the organization Provide support in implementing the appropriate tools software applications and systems to support data technology goals Collaborate with project managers and business teams for all projects involving enterprise data Analyse data-related issues with systems integration compatibility and multi-platform integration Project Control And Review Provide advice to teams facing complex technical issues in the course of project delivery Define and measure project and program specific architectural and technology quality metrics Knowledge Management & Capability Development Publish and maintain a repository of solutions best practices and standards and other knowledge articles for data management Conduct and facilitate knowledge sharing and learning sessions across the team Gain industry standard certifications on technology or area of expertise Support technical skill building (including hiring and training) for the team based on inputs from project manager /RTE’s Mentor new members in the team in technical areas Gain and cultivate domain expertise to provide best and optimized solution to customer (delivery) Requirement Gathering And Analysis Work with customer business owners and other teams to collect analyze and understand the requirements including NFRs/define NFRs Analyze gaps/ trade-offs based on current system context and industry practices; clarify the requirements by working with the 
Define the systems and sub-systems that define the programs. People Management Set goals and manage performance of team engineers. Provide career guidance to technical specialists and mentor them. Alliance Management Identify alliance partners based on the understanding of service offerings and client requirements. In collaboration with the Architect, create a compelling business case around the offerings. Conduct beta testing of the offerings and relevance to the program. Technology Consulting In collaboration with Architects II and III, analyze the application and technology landscape, processes and tools to arrive at the architecture options that best fit the client program. Analyze cost vs. benefits of solution options. Support Architects II and III to create a technology/architecture roadmap for the client. Define the architecture strategy for the program. Innovation And Thought Leadership Participate in internal and external forums (seminars, paper presentations, etc.). Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency. Identify business opportunities to create reusable components/accelerators and reuse existing components and best practices. Project Management Support Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies. Stakeholder Management Monitor the concerns of internal stakeholders like Product Managers and RTEs and external stakeholders like client architects on architecture aspects. Follow through on commitments to achieve timely resolution of issues. Conduct initiatives to meet client expectations. Work to expand the professional network in the client organization at team and program levels. New Service Design Identify potential opportunities for new service offerings based on customer voice/partner inputs. Conduct beta testing/POC as applicable. Develop collaterals and guides for GTM. Skill Examples Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of Architects. Use technology knowledge to create Proof of Concept (POC)/(reusable) assets under the guidance of the specialist. Apply best practices in own area of work, helping with performance troubleshooting and other complex troubleshooting. Define, decide and defend the technology choices made; review solutions under guidance. Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST. Use independent knowledge of Design Patterns, Tools and Principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate options for the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by Specialists for efficiency (consumption of hardware and memory, memory leaks, etc.). Use knowledge of Software Development Process, Tools & Techniques to identify and assess incremental improvements for the software development process, methodology and tools. Take technical responsibility for all stages in the software development process. Conduct optimal coding with a clear understanding of memory leakage and related impact.
Implement global standards and guidelines relevant to programming and development; come up with 'points of view' and new technological ideas. Use knowledge of Project Management & Agile Tools and Techniques to support, plan and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies. Use knowledge of Project Metrics to understand relevance in the project; collect and collate project metrics and share them with the relevant stakeholders. Use knowledge of Estimation and Resource Planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place. Strong proficiency in understanding data workflows and dataflows. Attention to detail. High analytical capability. Knowledge Examples Data visualization. Data migration. RDBMSs (relational database management systems). SQL. Hadoop technologies like MapReduce, Hive and Pig. Programming languages, especially Python and Java. Operating systems like UNIX and MS Windows. Backup/archival software. Additional Comments Snowflake Architect Key Responsibilities: Solution Design: Designing the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms). Data Modeling: Designing efficient and scalable physical data models within Snowflake. Defining table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance. Security Architecture: Designing the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies. Performance and Scalability Strategy: Designing solutions with performance and scalability in mind. Defining warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensuring the architecture can handle future growth in data volume and user concurrency. Cost Optimization Strategy: Designing architectures that are inherently cost-effective. Making strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks). Technology Evaluation and Selection: Evaluating and recommending specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements. Standards and Governance: Defining best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization. Roadmap and Strategy: Aligning the Snowflake data architecture with overall business intelligence and data strategy goals. Planning for future enhancements and platform evolution. Technical Leadership: Providing guidance and mentorship to developers, data engineers, and administrators working with Snowflake. Key Skills: Deep understanding of Snowflake's advanced features and architecture. Strong data warehousing concepts and data modeling expertise. Solution architecture and system design skills. Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates. Expertise in performance tuning principles and techniques at an architectural level. Strong understanding of data security principles and implementation patterns.
Knowledge of various data integration patterns (ETL, ELT, Streaming). Excellent communication and presentation skills to articulate designs to technical and non-technical audiences. Strategic thinking and planning abilities. Looking for candidates with 12+ years of experience to join our team. Skills Snowflake, Data modeling, Cloud platforms, Solution architecture Show more Show less
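
For illustration, the masking-policy and RBAC duties this role describes map onto a few lines of Snowflake DDL. Below is a minimal sketch using the snowflake-connector-python package; the credentials, role names (PII_ADMIN, COMPLIANCE) and the CUSTOMERS.EMAIL column are hypothetical placeholders, not part of the posting, and masking policies require Snowflake Enterprise edition or higher.

```python
# Minimal sketch: create and apply a Snowflake dynamic data masking policy
# through the Python connector. All credentials and object names below are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    role="PII_ADMIN",          # hypothetical role with masking privileges
    warehouse="ADMIN_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Masking policy: privileged roles see the raw email, everyone else sees a
# redacted local part. Stored data is unchanged; masking happens at query time.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ADMIN', 'COMPLIANCE') THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END
""")

# Attach the policy to a column; every query against it is now governed.
cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")

cur.close()
conn.close()
```

Keeping the policy definition separate from the table (rather than baking redaction into views) is what lets one policy govern the same column type consistently across schemas, which is the architectural point the posting makes.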

Posted 1 month ago

Apply


0 years

0 Lacs

Delhi, India

On-site

Shadow the design discussions the Senior Designer has with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane Software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer Be available for Site Visits and Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners. Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1 yr of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of modular furniture Practical knowledge of SketchUp A great attitude. Show more Show less

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Shadow the design discussions the Senior Designer has with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane Software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer Be available for Site Visits and Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners. Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1 yr of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of modular furniture Practical knowledge of SketchUp A great attitude. Show more Show less

Posted 1 month ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

202408432 Gurugram, Haryana, India Thane, Maharashtra, India Description The Role: Partner with other architecture resources to lead the end-to-end architecture of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability. Develop a broad understanding of the data lake architecture, including the impact of changes on a whole system, the onboarding of clients and the security implications of the solution. Design new architecture, or improve upon existing architecture, including data ingestion, storage, transformation and consumption layers. Define data models, schemas, and database structures optimized for H&B use cases including claims, census, placement, broking and finance sources. Design solutions for seamless integration of diverse health and benefits data sources. Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview. Evaluate data lake architecture to understand how technical decisions may impact business outcomes and suggest new solutions/technologies that better align to the Health and Benefits Data strategy. Draw on internal and external practices to establish data lake architecture best practices and standards within the team and ensure that they are shared and understood. Continuously develop technical knowledge and be recognised as a key resource across the global team. Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform is delivering to the highest possible standards and that solutions support stakeholder needs and business requirements. Initiate practices that will increase code quality, performance and security. Develop recommendations for continuous improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the potential implications of changes. Build the team’s technical expertise/capabilities/skills through the delivery of regular feedback, knowledge sharing, and coaching. High learning adaptability, demonstrating understanding of the implications of technical issues on business requirements and/or operations. Analyze existing data design and suggest improvements that promote performance, stability and interoperability. Work with product management and business subject matter experts to translate business requirements into good data lake design. Maintain the governance model on the data lake architecture through training, design reviews, code reviews, and progress reviews. Participate in the development of data lake architecture and roadmaps in support of business strategies. Communicate with key stakeholders and development teams on technical solutions. Convince and present proposals by way of high-level solutions to end users and/or stakeholders. The Requirement: Candidate must have significant experience in a technology-related discipline, such as IT or Engineering; a Bachelor's/College Degree in these areas is beneficial. Strong experience in databases, tools and methodologies Strong skills across a broad range of database technologies including Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and other Azure Services. Working knowledge of Microsoft Fabric is preferred.
Data Analysis, Data Modeling, Data Integration, Data Warehousing, Database Design Experience with database performance evaluation and remediation Develop strategies for data acquisition, archive recovery and implementation Be able to design and develop Databases, Data Warehouses and Multidimensional Databases Experience in Data Governance including Microsoft Purview, Azure Data Catalogue, Azure Data Share, and other Azure tools. Familiarity with legal risks related to data usage and rights. Experience in data security, including Azure Key Vault, Azure Data Encryption, Azure Data Masking, Azure Data Anonymization, and Azure Active Directory. Ability to develop database strategies for flexible high-performance reporting and business intelligence Experience using data modeling tools & methodology Experience working within an Agile Scrum Development Life Cycle, across varying levels of Agile maturity Experience working with geographically distributed scrum teams Excellent verbal and writing skills, including the ability to research, design, and write new documentation, as well as to maintain and improve existing material Technical competencies: Subject Matter Expertise Developing expertise You strengthen your depth and/or breadth of subject matter knowledge and skills across multiple areas. You define the expertise required in your area based on emerging technologies and industry practices. You build the team’s capability accordingly. Applying expertise You apply subject matter knowledge and skills across multiple areas to assess the impact of complex issues and implement long-term solutions. You foster innovation using subject matter knowledge to enhance tools, practices, and processes for the team. Solution Development Systems thinking You lead and foster collaboration across H&B Data Platform Technology to develop solutions to complex issues. You apply a whole-systems approach to evaluating impact, and take ownership for ensuring links between structure, people and processes are made. Focusing on quality You instill a quality mindset in the team and ensure the appropriate methods, processes and standards are in place for teams to deliver quality solutions. You create and deliver improvement initiatives. Technical Communication Simplifying complexity You develop tools, aids and/or original content to support the delivery and/or understanding of complex information. You guide others on best practice. Qualifications Candidate must have significant experience in a technology-related discipline, such as IT or Engineering; a Bachelor's/College Degree in these areas is beneficial.
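
As a concrete illustration of the "Azure Data Masking" skill this posting lists: Azure SQL Database supports dynamic data masking through plain T-SQL, which can be issued from any client. Below is a minimal sketch via pyodbc; the connection string, database, table and role names are hypothetical placeholders.

```python
# Minimal sketch: enable dynamic data masking on an Azure SQL Database column.
# The stored value is unchanged; non-privileged readers see the masked form.
# All names and credentials below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server.database.windows.net;"  # placeholder
    "DATABASE=hb_claims;"                        # hypothetical database
    "UID=your_user;PWD=your_password"            # placeholders
)
cur = conn.cursor()

# Mask all but the last four characters of a member identifier.
cur.execute("""
    ALTER TABLE dbo.claims
    ALTER COLUMN member_id ADD MASKED WITH (FUNCTION = 'partial(0, "XXXXX", 4)')
""")

# Grant UNMASK only to roles that genuinely need raw identifiers.
cur.execute("GRANT UNMASK TO compliance_reader")
conn.commit()
cur.close()
conn.close()
```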

Posted 1 month ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description Job Title: Lead - Data Analyst The Purpose of this Role The Workplace Investment (WI) Test Data Management CoE is seeking an experienced Data Analyst to lead data exploration, reporting, and quality initiatives supporting test environments. You will analyze trends, detect anomalies, and provide actionable insights to help improve test data availability, accuracy, and data/environment stability. You will partner with engineering, QA, and TDM teams to ensure data decisions are grounded in reliable, timely information. The Expertise And Skills You Bring 6-9 years of experience as a Data Analyst, preferably in a testing, QA, or engineering environment Strong SQL skills and experience working with large, complex datasets Experience with reporting tools (Tableau, Power BI, or similar) A strong background in PL/SQL, Informatica, and Snowflake is a must Ability to investigate data issues, identify patterns, and explain root causes Familiarity with test data concepts, data masking, or data provisioning processes is a plus Strong communication and data storytelling skills; able to explain findings to technical and non-technical teams Experience working with Agile teams, JIRA, and documentation tools (e.g., Confluence) Exposure to cloud platforms (AWS, Azure) and scripting languages (e.g., Python) is desirable Passion for data quality, consistency, and driving better decisions with facts The Value You Deliver You perform in-depth analysis of test data patterns, usage, and gaps across environments You detect anomalies or inconsistencies in data and help teams address root causes You build clear, insightful dashboards and reports to support data-driven decision-making You partner with QE, TDM, and engineering teams to define test data quality metrics You support test data compliance and masking efforts by identifying sensitive data across sources You contribute to environment stability by flagging issues early and helping teams prioritize fixes You document insights, trends, and recurring data issues to support long-term improvements How Your Work Impacts The Organization The Team The WI QE Center of Excellence (CoE) drives engineering excellence and data/environment stability across WI squads. As a Data Analyst, you’ll play a key role in ensuring test data is trustworthy, visible, and ready to support rapid, high-quality testing. Your insights will help shape how we build, monitor, and evolve our test data ecosystem. Location: TRIL, Chennai Timing: 11AM to 8PM Certifications Category: Information Technology Show more Show less
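
The "identifying sensitive data across sources" duty above is often bootstrapped with simple heuristics before a formal discovery tool is applied. A minimal sketch in Python/pandas follows; the regex patterns and sample data are illustrative only, not a complete sensitive-data taxonomy.

```python
# Minimal sketch: flag columns that may contain sensitive values in a test
# dataset using regex heuristics over a pandas DataFrame. Patterns and the
# sample data are illustrative placeholders.
import re
import pandas as pd

PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "pan_card": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # Indian PAN format
}

def flag_sensitive_columns(df: pd.DataFrame, sample_size: int = 1000) -> dict:
    """Return {column: [matched pattern names]} for columns whose sampled
    values match any sensitive-data pattern."""
    hits = {}
    for col in df.columns:
        sample = df[col].dropna().astype(str).head(sample_size)
        matched = [
            name for name, rx in PATTERNS.items()
            if sample.str.contains(rx).any()
        ]
        if matched:
            hits[col] = matched
    return hits

if __name__ == "__main__":
    df = pd.DataFrame({
        "account_id": ["A-1001", "A-1002"],
        "contact": ["jane@example.com", "raj@example.com"],
    })
    print(flag_sensitive_columns(df))  # {'contact': ['email']}
```

Sampling rather than scanning full tables keeps the pass cheap enough to run on every environment refresh; flagged columns then feed the masking backlog.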

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Informatica TDM 1. Data Discovery, Data Subsetting and Data Masking 2. Data Generation 3. Complex masking and generation rule creation 4. Performance tuning for Informatica mappings 5. Debugging with Informatica PowerCenter 6. Data Migration Skills and Knowledge 1. Informatica TDM development experience in Data Masking, Discovery, Data Subsetting and Data Generation is a must 2. Should have experience in working with flat files, MS SQL, Oracle and Snowflake 3. Debugging using Informatica PowerCenter 4. Experience in Tableau will be an added advantage 5. Should have basic knowledge about IICS 6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR
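
Informatica TDM masking rules are configured inside the tool itself, but the core idea behind a "complex masking rule" is often deterministic substitution: the same input always masks to the same output so that joins across masked tables still line up. A minimal standard-library sketch of that logic (not Informatica's API) follows; the key is a placeholder.

```python
# Illustrative sketch only: deterministic (repeatable) substitution masking of
# the kind a TDM masking rule performs. Not Informatica's API.
import hashlib
import hmac

SECRET = b"rotate-me"  # placeholder key; manage via a secrets store in practice

def mask_value(value: str, alphabet: str = "0123456789") -> str:
    """Deterministically replace each character with one derived from an
    HMAC of the full value, preserving the value's length."""
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).digest()
    return "".join(alphabet[digest[i % len(digest)] % len(alphabet)]
                   for i in range(len(value)))

# Same input always yields the same mask, so referential integrity across
# subsetted, masked tables is preserved.
print(mask_value("9876543210"))
print(mask_value("9876543210"))  # identical output
```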

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) KEY Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge in programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessment of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams etc.) Evaluate SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. Offer consultative advice in security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers Verification of data of log sources in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist clients with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Build advanced visualizations (Interactive Drilldown, Glass tables etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI etc. Sound knowledge in configuration of Alerts and Reports. Good exposure in automatic lookups, data models and creating complex SPL queries. Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions etc.
Qualification & experience: Minimum of 3 to 6 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component to effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with designing and implementing Splunk, with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management: Multiple cluster deployments & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
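
The "masking of data prior to ingestion in SIEM" capability above can be sketched concretely: redact sensitive fields before events reach the indexer. In Splunk deployments this is commonly done with SEDCMD in props.conf on the forwarder; the Python sketch below shows the same idea for events pushed through the HTTP Event Collector (HEC). The HEC URL, token and sourcetype are placeholders.

```python
# Minimal sketch: mask card numbers in events before forwarding them to a
# Splunk HTTP Event Collector (HEC) endpoint. URL and token are placeholders.
import re
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder

CARD_RX = re.compile(r"\b(\d{4})-\d{4}-\d{4}-(\d{4})\b")

def mask_event(raw: str) -> str:
    # Keep the first and last groups for traceability, redact the middle.
    return CARD_RX.sub(r"\1-XXXX-XXXX-\2", raw)

def send_to_hec(raw_event: str) -> None:
    payload = {"event": mask_event(raw_event), "sourcetype": "app:payments"}
    resp = requests.post(
        HEC_URL,
        json=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

send_to_hec("2024-06-01 payment ok card=4111-1111-1111-1111 user=42")
```

Masking before ingestion (rather than at search time) means the raw sensitive value never lands on disk in the index, which is usually what compliance reviews ask for.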

Posted 1 month ago

Apply

0.0 - 1.0 years

0 - 3 Lacs

Noida

Work from Office

- Knowledge of Clipping Path (pen tool) is a must - Basic knowledge of the Selection tool, Color Correction, Cropping, Retouching and Re-sizing - Should be comfortable working in day/night shifts (extra perks for extra hrs) Required Candidate profile - Ready to learn new things - Freshers can apply - Knowledge of the Adobe Photoshop pen tool is a must - Good incentives for working extra hrs

Posted 1 month ago

Apply

20.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior Data Solution Architect Job Summary: The Senior Data Solution Architect is a visionary and technical leader responsible for designing and guiding enterprise-scale data solutions. Leveraging 20+ years of experience, this individual works closely with business and IT stakeholders to deliver scalable, secure, and high-performing data architectures that support strategic goals, data-driven innovation, and digital transformation. This role encompasses solution design, platform modernization, cloud data architecture, and deep integration with enterprise systems. Key Responsibilities: Solution Architecture & Design Lead the end-to-end architecture of complex data solutions across domains including analytics, AI/ML, MDM, and real-time processing. Design robust, scalable, and future-ready data architectures using modern technologies (e.g., cloud data platforms, streaming, NoSQL, graph databases). Deliver solutions that balance performance, scalability, security, and cost-efficiency. Enterprise Data Integration Architect seamless data integration across legacy systems, SaaS platforms, IoT, APIs, and third-party data sources. Define and implement enterprise-wide ETL/ELT strategies using tools like Informatica, Talend, DBT, Azure Data Factory, or AWS Glue. Support real-time and event-driven architecture with tools such as Kafka, Spark Streaming, or Flink. Cloud Data Platforms & Infrastructure Design cloud-native data solutions on AWS, Azure, or GCP (e.g., Redshift, Snowflake, BigQuery, Databricks, Synapse). Lead cloud migration strategies from legacy systems to modern, cloud-based data architectures. Define standards for cloud data governance, cost management, and performance optimization. Data Governance, Security & Compliance Partner with governance teams to enforce enterprise data governance frameworks. Ensure solutions comply with regulations such as GDPR, HIPAA, CCPA, and industry-specific mandates. Embed security and privacy by design in data architectures (encryption, role-based access, masking, etc.). Technical Leadership & Stakeholder Engagement Serve as a technical advisor to CIOs, CDOs, and senior business executives on data strategy and platform decisions. Mentor architecture and engineering teams; provide guidance on solution patterns and best practices. Facilitate architecture reviews, proof-of-concepts (POCs), and technology evaluations. Innovation & Continuous Improvement Stay abreast of emerging trends in data engineering, AI, data mesh, data fabric, and edge computing. Evaluate and introduce innovative tools and patterns (e.g., serverless data pipelines, federated data access). Drive architectural modernization, legacy decommissioning, and platform simplification. Qualifications: Education: Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field; Master’s or MBA preferred. Experience: 20+ years in IT with at least 10 years in data architecture or solution architecture roles. Demonstrated experience in large-scale, complex data platform architecture and enterprise transformations. Deep experience with multiple database technologies (SQL, NoSQL, columnar, time series). Strong programming/scripting background (e.g., Python, Scala, Java, SQL). Proven experience architecting on at least one major cloud provider (AWS, Azure, GCP). Familiarity with DevOps, CI/CD, and DataOps practices.
Preferred Certifications: AWS/Azure/GCP Solution Architect (Professional level preferred) TOGAF or Zachman Framework Certification Snowflake/Databricks Certified Architect CDMP (Certified Data Management Professional) or DGSP Key Competencies: Strategic and conceptual thinking with the ability to translate business needs into technical solutions. Exceptional communication, presentation, and negotiation skills. Leadership in cross-functional teams and matrix environments. Deep understanding of business processes, data monetization, and digital strategy. Success Indicators: Delivery of transformative data platforms that enhance analytics and decision-making. Improved data integration, quality, and access across the enterprise. Successful migration to cloud-native or hybrid architectures. Reduction of technical debt and legacy system dependencies. Increased reuse of solution patterns, accelerators, and frameworks. Show more Show less
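
One concrete reading of "embed security and privacy by design (encryption, role-based access, masking)" from the responsibilities above is field-level encryption applied before data lands in the platform. A minimal sketch using the cryptography package follows; key handling is deliberately simplified, and in a real design the key would come from a KMS or secrets manager, never be generated inline.

```python
# Minimal sketch: field-level encryption of a sensitive attribute before it
# lands in a data platform. Record contents are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice: fetched from a KMS, not generated here
f = Fernet(key)

record = {"member_id": "M-1001", "diagnosis_code": "E11.9"}

# Encrypt only the sensitive field; non-sensitive fields stay queryable.
record["diagnosis_code"] = f.encrypt(record["diagnosis_code"].encode()).decode()
print(record)

# Authorized consumers holding the key can recover the original value.
plain = f.decrypt(record["diagnosis_code"].encode()).decode()
assert plain == "E11.9"
```

Encrypting at the field rather than the storage layer means a compromised warehouse dump exposes ciphertext, while aggregate analytics on the untouched columns keep working.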

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Shadow the design discussions the Senior Designer has with clients; prepare Minutes of Meetings and keep track of project milestones to ensure a timely and high-quality delivery Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane Software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer Be available for Site Visits and Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners. Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1 yr of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of modular furniture Practical knowledge of SketchUp A great attitude. Show more Show less

Posted 1 month ago

Apply

40.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description The candidate must have strong troubleshooting skills on Database and Database technology products Expertise in performance issue analysis and providing resolution Guide customers on Oracle Database best practices Should possess knowledge of implementing and supporting Database Security products like Transparent Data Encryption, Redaction, Data Vault, Masking. Possess strong troubleshooting skills on Real Application Clusters Should be able to guide and mentor a team of engineers on Database Technology products Should possess knowledge of, and be able to articulate to customers, the use cases of Advanced Compression and In-Memory Knowledge of Oracle Enterprise Manager Personal Skills Strong experience in service delivery and/or project management is required. Oracle products and services knowledge will be highly appreciated as well as experience in Oracle HW platforms and OS. Experience with Enterprise Customers is required Excellent communication / relationship building skills Customer focused and results oriented Ability to work under pressure in highly escalated situations Organized with strong attention to detail Decision making / problem solving skills Ability to manage multiple concurrent activities (customer engagements) Highly professional: ability to deal with senior and exec stakeholders with confidence Strong analytic skills and ability to pre-empt potential risks and issues Career Level - IC4 Responsibilities RESPONSIBILITIES Be the single point of contact within Oracle for the customer, acting as their advocate for the service you are responsible for delivering. The CSS TAM is a customer advocate and must demonstrate customer obsession by placing the client needs first. Provide technical guidance and be part of customer calls/meetings on adoption of database technology Should possess strong technical skills on Database and DB products to advocate to the customer the use cases and guide the customer and the team of Oracle CSS Engineers through the lifecycle of Oracle Technology product adoption Manage the contract or delivery engagement as defined by ACS line management, including creating and maintaining accurate documentation Maintain the Oracle business systems to ensure systems are up to date with the correct/current information (resource assignment, timecards, rates, completion estimates, invoice details etc.) to ensure that services are delivered efficiently, invoices are generated in a timely manner and revenues are recognised promptly. Plan and deploy resources to ensure effective delivery within agreed budgetary constraints. Where appropriate create and maintain the ACS service delivery or project plan.
Actively manage project forecasts, identify risks, issues and opportunities for revenue collection (upside) Accountabilities: Proactively manage the contract delivery to completion / customer acceptance Proactively report on any potential risks / issues that may impact service delivery or customer satisfaction Manage any customer escalation that may arise Ensure all contract-related systems and documentation, either required contractually or as part of a program, are up to date and accurate Monitor and report revenue forecast and margin estimates, revenue and margin achievements for each contract Work in line with customer working practices and procedures, if contractually agreed Operate in line with Oracle CSS business processes and procedures Operate in line with Oracle Global and local HR policies and procedures About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less
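
Of the Database Security products this role supports, Data Redaction is the most self-contained to illustrate: a policy added with the DBMS_REDACT package rewrites sensitive column values at query time without touching stored data. A minimal sketch through the python-oracledb driver follows; DSN, credentials and object names are placeholders, and ADD_POLICY requires the appropriate privileges (typically granted via EXECUTE on DBMS_REDACT).

```python
# Minimal sketch: add an Oracle Data Redaction policy via DBMS_REDACT,
# issued through python-oracledb. All names and credentials are placeholders.
import oracledb

conn = oracledb.connect(user="sec_admin", password="your_password",
                        dsn="dbhost.example.com/ORCLPDB1")  # placeholders
cur = conn.cursor()

# Partially redact a card number column for every session (expression '1=1'):
# digits 1-12 are replaced with '*', the last four remain visible.
cur.execute("""
    BEGIN
      DBMS_REDACT.ADD_POLICY(
        object_schema       => 'APP',
        object_name         => 'PAYMENTS',
        policy_name         => 'redact_card',
        column_name         => 'CARD_NUMBER',
        function_type       => DBMS_REDACT.PARTIAL,
        function_parameters => 'VVVVFVVVVFVVVVFVVVV,VVVV-VVVV-VVVV-VVVV,*,1,12',
        expression          => '1=1');
    END;
""")
cur.close()
conn.close()
```

Because redaction happens in the SQL layer, application code and backups are untouched, which is why it is often positioned alongside, not instead of, Transparent Data Encryption.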

Posted 1 month ago

Apply