
1581 Data Security Jobs - Page 43

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

We're looking for a Staff Product Designer to join Procore's Asset Management division, which drives the intricacies of collaboration, planning, management, tracking, and monitoring of people, processes, equipment, materials, maps, and spatial experiences. Procore's software solutions aim to improve the lives of everyone in construction, and the people within Product & Technology are the driving force behind our innovative, top-rated global platform. We're a customer-centric group that encompasses engineering, product, product design, data, security, and business systems. As a Staff Product Designer, you'll partner with Product and Engineering teams to create delightful, effortless experiences for our users. You'll shape our products by understanding our users' needs and translating them into intuitive designs, validating them frequently on their path from concept to polished product. The person in this role will have worked on complex products and is prepared to lead visionary, cross-cutting solutions through to completion by spearheading research efforts with customers and creating comprehensive design solutions. If you constantly analyze and obsess over other products' experiences, we'd like to hear from you! This position reports to the Design Director for Resource & Asset Management and will be based in our Bangalore, India office. We're looking for someone to join us immediately.
What you'll do:
- Design and validate new experiences via mockups, wireframes, flow diagrams, sketches, and other UX artifacts for our cloud-based applications
- Work as an embedded member of multiple cross-functional agile product development teams, partnering with Product Managers and Engineers to set the product's strategy and create solutions based on research
- Partner with in-house UX research experts to conduct generative research and usability tests with Procore users, both online and in person at job sites around the country
- Advocate for the user and evangelize user experience throughout the organization
- Collaborate with other designers and product teams in your product area
- Promote a positive culture within your product team and your division, and overcome challenges through endurance, grit, and persistence
- Leverage and advocate for the patterns, content, and solutions created by our Design System, Content Strategy, UX Research, and DesignOps teams, and make contributions that provide value to the department

What we're looking for:
- 7+ years of experience designing world-class apps, with a strong portfolio showcasing your research process and design work
- Experience with complex B2B Enterprise/SaaS products
- Proficiency with leading design software such as Figma or Sketch, and experience with prototyping tools
- Curiosity about the way people think and human behavior
- Excellent communication skills, with a proven track record of presenting designs to cross-functional teams and evangelizing UX to the organization
- Experience with iOS/Android design standards and a passion for interaction design
- Experience partnering with Product and Engineering to achieve impactful outcomes in an agile product development environment
- Comfort leveraging data to guide design decisions toward better product and user outcomes

Posted 1 month ago

Apply

8.0 - 12.0 years

13 - 17 Lacs

Ahmedabad

Work from Office

Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with business objectives and the technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Requirements:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow)
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano)
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery)
- Experience with data governance frameworks and tools
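The ETL/ELT duties above follow a standard extract-transform-load shape. A minimal sketch, with `sqlite3` standing in for both the operational source and the warehouse (the table and column names are hypothetical; a real implementation would use one of the listed tools such as Informatica or Talend):

```python
# Illustrative ETL sketch: extract raw rows, normalize them, bulk-load the result.
# sqlite3 is a stand-in for a real source system and warehouse; schema is invented.
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, normalize currency and country codes, load a fact table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the operational source table.
    rows = cur.execute("SELECT id, amount_cents, country FROM raw_orders").fetchall()
    # Transform: cents -> decimal amount, uppercase the country code.
    cleaned = [(oid, cents / 100.0, country.upper()) for oid, cents, country in rows]
    # Load: bulk-insert into the warehouse-side fact table.
    cur.executemany("INSERT INTO fact_orders (id, amount, country) VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, country TEXT)")
conn.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL, country TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, 1999, "in"), (2, 2500, "us")])
loaded = run_etl(conn)
```

The governance and performance-tuning responsibilities would sit around this core: indexing `fact_orders`, enforcing access controls, and validating row counts between source and target.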

Posted 1 month ago

Apply

15.0 - 18.0 years

8 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Experience: 15+ years overall | Minimum 10 full-cycle AI/ML project implementations, including GenAI experience

Role Summary: We are seeking an AI Architect to lead strategic AI transformation initiatives. This role demands deep hands-on experience in AI, Machine Learning (ML), and Generative AI (GenAI), along with the ability to engage directly with C-level stakeholders, align technical delivery with business objectives, and drive enterprise-wide adoption of advanced AI solutions. The ideal candidate is a techno-strategic leader who can take AI/ML/GenAI projects from ideation to production: building architectures, leading cross-functional teams, and ensuring regulatory and operational alignment in BFSI environments.

Consulting & Business Alignment
- Partner with senior business and IT leadership, including CIOs, CDOs, and COOs, to identify high-impact use cases across retail banking, insurance, credit, and capital markets.
- Translate complex BFSI challenges into technically feasible and scalable AI/ML/GenAI solutions.
- Create strategic roadmaps, capability assessments, and PoV/PoC execution plans that align with business KPIs and regulatory needs.

Solution Architecture & Delivery Leadership
- Design and lead delivery of AI/ML/GenAI pipelines covering data ingestion, model training, validation, deployment, and monitoring.
- Build and scale GenAI-based solutions such as LLM-driven chatbots, intelligent document processing, RAG pipelines, summarization tools, and virtual assistants.
- Architect cloud-native AI platforms using AWS (SageMaker, Bedrock), Azure (ML, OpenAI), or GCP (Vertex AI, BigQuery, LangChain).
- Define and implement MLOps and LLMOps frameworks for versioning, retraining, CI/CD, and production observability.
- Ensure adherence to Responsible AI principles, including explainability, bias mitigation, auditability, and regulatory compliance.

Engineering & Integration
- Work closely with data engineering teams to acquire, transform, and pipeline data from core banking systems, CRMs, claims systems, and real-time feeds.
- Design architecture for data lakes, feature stores, and vector databases supporting AI and GenAI use cases.
- Enable seamless integration of AI capabilities into enterprise workflows, customer platforms, and decision engines via APIs and microservices.

Required Skills & Experience:
- 15+ years of experience in AI/ML, data engineering, and cloud architecture.
- Minimum of 10 end-to-end AI/ML project implementations, from use-case discovery through to productionization.
- Proven expertise in any one of: AI/ML frameworks (scikit-learn, XGBoost, TensorFlow, PyTorch); GenAI/LLM platforms (OpenAI, Cohere, Mistral, LangChain, Hugging Face, vector DBs such as Pinecone, FAISS, Chroma); cloud platforms (AWS, Azure, GCP, including AI/ML and GenAI native services); MLOps/LLMOps tools (MLflow, Kubeflow, SageMaker Pipelines, Vertex AI Pipelines).
- Strong experience with data security, governance, model risk management, and AI compliance frameworks relevant to BFSI.
- Ability to lead large cross-functional teams and engage both technical teams and senior stakeholders.
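The RAG pipelines this role architects hinge on one step: ranking stored documents against a query embedding. A toy sketch of that retrieval step, using hand-written 3-dimensional "embeddings" (a real system would use model-generated vectors and a vector DB such as FAISS or Pinecone from the list above; document texts here are invented):

```python
# Toy retrieval step of a RAG pipeline: cosine-similarity ranking over a
# tiny in-memory document store. Vectors and texts are fabricated examples.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical document store: id -> (embedding, text).
docs = {
    "kyc": ([1.0, 0.1, 0.0], "KYC rules for retail onboarding"),
    "npa": ([0.0, 1.0, 0.2], "NPA provisioning norms"),
}

def retrieve(query_vec, k=1):
    """Rank documents by similarity to the query vector; return top-k texts."""
    ranked = sorted(docs.items(),
                    key=lambda kv: cosine(query_vec, kv[1][0]),
                    reverse=True)
    return [text for _, (_, text) in ranked[:k]]

top = retrieve([0.9, 0.2, 0.0])
```

In production, the retrieved texts would be stitched into the LLM prompt, which is where the observability and Responsible AI guardrails named above attach.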

Posted 1 month ago

Apply

8.0 - 9.0 years

25 - 30 Lacs

Chennai

Work from Office

Role: Oracle Database Developer
Location: Offshore/India

Who are we looking for? A candidate with 8+ years of experience in Oracle Database Development who will be responsible for the development, testing, and maintenance of Oracle databases.

Technical Skills:
- 8+ years' experience developing PL/SQL procedures, functions, and packages
- Must have experience with bulk loading methods: Oracle SQL*Loader (sqlldr) and external tables
- Prior experience in performance tuning of database queries and materialized views, or similar (development environment)
- Capable of documenting small- to medium-scale projects with minimal supervision; may be involved in developing and delivering presentations
- Designs, develops, and supports solutions using the Oracle relational database management system
- Strong written and oral communication skills
- Excellent problem-solving and quantitative skills
- Demonstrated ability to work as part of a team

Responsibilities:
- Design and implement efficient database structures for new and existing applications.
- Develop, test, and optimize complex SQL queries, functions, and stored procedures.
- Collaborate with software developers to ensure effective data flow and integration.
- Monitor, tune, and report on database performance and scalability.
- Ensure data security and integrity through proper access controls and backup procedures.
- Troubleshoot database-related issues and optimize performance.

Qualification: At least 8 years of experience in Oracle Database Development.
Education qualification: Any degree from a reputed college
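The sqlldr bulk-loading requirement above revolves around control files that describe how a flat file maps onto a table. A hedged sketch that generates one (table, columns, and file name are hypothetical; in practice you would run `sqlldr userid=... control=orders.ctl` against an Oracle instance):

```python
# Illustrative generator for an Oracle SQL*Loader control file.
# The orders table and CSV layout are invented for the example.
def make_ctl(table: str, columns: list, datafile: str) -> str:
    """Build a minimal SQL*Loader control file for a comma-separated datafile."""
    cols = ", ".join(columns)
    return (
        f"LOAD DATA\n"
        f"INFILE '{datafile}'\n"
        f"APPEND INTO TABLE {table}\n"
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'\n"
        f"({cols})\n"
    )

ctl = make_ctl(
    "orders",
    ["order_id", "amount", "order_date DATE 'YYYY-MM-DD'"],
    "orders.csv",
)
```

The external-table alternative mentioned in the posting expresses the same mapping as `CREATE TABLE ... ORGANIZATION EXTERNAL` DDL, letting you query the flat file with plain SQL before inserting.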

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Role Description: Senior Product Manager, SSE Threat Protection and Platform Services

The Senior Product Manager, SSE Threat Protection and Platform Services will be responsible for defining and executing the roadmap for Skyhigh Security's Advanced Threat Protection, URL Filtering, DNS Security, and IPS, ensuring customers' networks and users are protected from evolving cyber threats by deeply understanding the threat landscape. In addition, the candidate is responsible for overseeing Platform Service capabilities across SSE products. The role requires a strategic mindset, technical expertise, and strong collaboration with engineering, sales, marketing, and customer success teams to deliver connectivity solutions that meet customer demands and maintain a competitive edge in the market. The Senior Product Manager will also engage directly with customers, partners, and analysts to understand needs, gather feedback, and shape product strategies.
Key Responsibilities
- Develop a robust strategy for Anti-Malware, URL Filtering, DNS Security, and IPS solutions for Skyhigh Security's platform, ensuring users are protected from the latest cyberattacks.
- Gather and nurture the product vision, and define requirements for Gateway Anti-Malware, URL Filtering, DNS Security, and IPS by working with customers and prospects and through market research and competitive analysis.
- Collaborate with engineering teams to design and implement protection against malware and phishing techniques using AI/ML.
- Ensure compatibility across hybrid and cloud deployments, addressing the needs of regulated industries.
- Manage the entire product life cycle, from strategic planning to efficient execution and delivery.

Qualifications and Skills
- 5-10 years of experience as a Product Manager, preferably with experience in cybersecurity.
- Strong technical background in cybersecurity, with a deep understanding of malware analysis and detection techniques.
- Ability to understand the SASE/SSE market, with a strong grasp of Network Sandboxing, URL Filtering solutions, Next-Generation Firewalls, Secure Web Gateway, IPS, and DNS Security solutions.
- Strong product management background with a proven track record of leading and delivering cybersecurity solutions.
- Ability to collaborate effectively across technical and business teams to achieve common goals.
- Exceptional communication and organizational skills to manage complex projects and align diverse stakeholders.
- Strategic thinker with a customer-first approach and a passion for innovation in networking and security.
- An advanced technology degree or MBA is preferred.

Posted 1 month ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Noida

Work from Office

Years of Experience: 8+ Years
Work Mode: Hybrid

Overview: Join our team to drive enterprise LLM initiatives, combining expert-level LLM knowledge with strong security and data integration expertise. You will oversee the architecture, development, and deployment of advanced ML solutions that integrate Microsoft Teams, Copilot 365, and Azure data tools, delivering secure and scalable outcomes across the organization.

Key Responsibilities:
- Lead architectural design and development for enterprise LLM solutions, emphasizing expert-level proficiency with LLM agents (highest priority).
- Ensure robust data security, leveraging strong knowledge of enterprise security practices.
- Configure and maintain both vector and structured databases, with solid hands-on experience in ETL pipelines.
- Oversee essential guardrails for LLMs, including basic observability, semantic layers, and model distillation for performance tuning.
- Coordinate role-based access controls (RBAC) and manage comprehensive data governance frameworks.
- Communicate complex technical strategies to executives, creating clear diagrams and abstractions.
- Guide cross-functional teams, balancing moderate risk-taking with innovative solution delivery.

Required Skills & Experience:
- Thorough expertise in LLM development and deployment (expert level).
- Significant experience applying enterprise security best practices (strong knowledge).
- Proficiency in vector databases, structured datastores, and ETL workflows (solid experience).
- Familiarity with LLM observability, guardrails, semantic layers, and model distillation (basic working knowledge).
- Strong leadership and communication skills, with the ability to present to executives and abstract complex ideas.
- Comfortable driving projects independently, managing risk, and fostering collaboration.

Preferred:
- Experience integrating Microsoft Teams, Copilot 365, Azure Data Factory/Synapse, and Dynamics 365/Dataverse.
- Demonstrated success in designing and implementing large-scale ML solutions in enterprise environments.
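The guardrail responsibility above usually starts with an input filter that strips obvious PII before a prompt reaches the model. A minimal sketch (the patterns and function name are assumptions for illustration; a production guardrail layer would be far more thorough, covering output checks and audit logging too):

```python
# Illustrative LLM input guardrail: redact e-mail addresses and card-like
# digit runs before a prompt is submitted to a model. Patterns are crude
# examples, not a complete PII policy.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b\d{13,19}\b")  # rough payment-card-number shape

def redact(prompt: str) -> str:
    """Mask e-mail addresses and long digit runs in the prompt text."""
    prompt = EMAIL.sub("[EMAIL]", prompt)
    return CARD.sub("[CARD]", prompt)

safe = redact("Contact alice@example.com, card 4111111111111111, order 42")
```

The RBAC and governance duties then decide who may call the model at all and which redaction policy applies to each data class.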

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using the Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls using the Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data platform components.

Professional & Technical Skills:
- Must-have: Experience with the Databricks Unified Data Analytics Platform.
- Must-have: Strong understanding of data modeling and database design principles.
- Good-to-have: Experience with cloud-based data platforms such as AWS or Azure.
- Good-to-have: Experience with data security and access controls.
- Good-to-have: Experience with data pipeline development and maintenance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai office.

Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education
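The pipeline work described in this role typically follows the bronze/silver/gold (medallion) layering common on Databricks. A sketch using plain Python lists as a stand-in for Spark DataFrames (field names and values are invented for illustration):

```python
# Medallion-style pipeline sketch: raw (bronze) -> cleaned (silver) ->
# aggregated (gold). Plain dicts stand in for Spark DataFrames.
bronze = [  # raw ingested events, with a duplicate and a bad row
    {"id": 1, "amount": "100"},
    {"id": 1, "amount": "100"},
    {"id": 2, "amount": "x"},
]

def to_silver(rows):
    """Deduplicate by id and drop rows whose amount is not numeric."""
    seen, out = set(), []
    for r in rows:
        if r["id"] in seen or not r["amount"].isdigit():
            continue  # skip duplicates and unparseable amounts
        seen.add(r["id"])
        out.append({"id": r["id"], "amount": int(r["amount"])})
    return out

def to_gold(rows):
    """Aggregate cleaned rows into a reporting-layer metric."""
    return {"total_amount": sum(r["amount"] for r in rows), "rows": len(rows)}

silver = to_silver(bronze)
gold = to_gold(silver)
```

On Databricks itself, each stage would be a Delta table, and the access-control responsibility above maps to table ACLs on the silver and gold layers.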

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines and ETL processes using the Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls for the data platform.
- Troubleshoot and resolve issues related to the data platform and data pipelines.

Professional & Technical Skills:
- Must-have: Experience with the Databricks Unified Data Analytics Platform.
- Must-have: Strong understanding of data modeling and database design principles.
- Good-to-have: Experience with cloud-based data platforms such as AWS or Azure.
- Good-to-have: Experience with data security and access controls.
- Good-to-have: Experience with data visualization tools such as Tableau or Power BI.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices.
- Mandatory office attendance (RTO) for 2-3 days, working one of two shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using the Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls using the Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data platform components.

Professional & Technical Skills:
- Must-have: Experience with the Databricks Unified Data Analytics Platform.
- Must-have: Strong understanding of data platform components and architecture.
- Good-to-have: Experience with cloud-based data platforms such as AWS or Azure.
- Good-to-have: Experience with data security and access controls.
- Good-to-have: Experience with data pipeline development and maintenance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices.
- Mandatory office attendance (RTO) for 2-3 days, working one of two shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Posted 1 month ago

Apply

9.0 - 14.0 years

10 - 20 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office

Skills Required:
- Basic understanding of infrastructure architecture, industry standards, and best-practice methodologies
- Highly motivated, with the ability to lead and influence
- Experience in an infrastructure-related role, with proven success in large-scale projects
- Strong technical design skills
- Ability to effectively communicate, coordinate, and collaborate; a successful track record of effective vendor management
- Knowledge of emerging technologies and the vendor landscape; ability to balance cost against benefits
- Understanding of business drivers and all stakeholders' requirements
- Understanding of documentation and frameworks
- Experience in capacity and performance management; understanding of application lifecycle management
- Hands-on experience and expertise with specific infrastructure technologies relevant to organizational needs, including operating systems software, virtualization, and automation (on multiple platforms)
- Creative thinking and an innovative approach to solution design, implementation, and problem solving
- Set and manage stakeholder expectations
- Recommend solutions per RFP requirements or change orders coming from existing clients
- Manage vendor relationships from a technical-matters perspective; mediate between the infrastructure and delivery/development groups
- Establish and vet key vendor relationships; assess emerging technologies from key OEMs
- Guide the sales team on price-versus-performance issues

Experience:
- 8-15 years of overall experience in the field of IT infrastructure is essential.
- Must have at least 3 years' experience in designing and implementing products or solutions in any one of the domains below:
  - Enterprise Servers & Storage (cloud computing, virtualization, consolidation, data center, business continuity, backup, enterprise server, storage, and tape technologies, clustering/high availability, etc.)
  - Enterprise Networking and Security (routing and switching protocols, network architecture, connectivity options, network management, remote access, data security, standards and compliance, identity management, log management, etc.)
  - Datacenter (Tier 3/3+ DC build, including power and cooling, access control, building management systems, etc.)

Certifications: Any of the industry-standard IT infrastructure certifications such as RHCE, MCSE, MCTS, CCNA, CCIE, VCP, or CISSP is essential. PMP or ITIL certification will be an advantage.

Job Description: As part of the Solution Architecting Team (SAT), the IT Infrastructure Architect will be responsible for the design and delivery of end-to-end IT infrastructure solutions for clients across business verticals. Responsibilities will include:
- Design of IT infrastructure solutions: develop technology strategy with logical and physical designs to meet client requirements, using standard architecture methodologies
- Handle multiple infrastructure technologies based on project requirements
- Preparation of bills of material and technical write-ups for solutions developed
- Documentation of architecture design to various levels of detail
- Work with CxO-level executives to capture client technical requirements and articulate the solution; deliver detailed briefings with presentations for larger client audiences
- Work as an individual contributor
- Ensure delivery of the infrastructure solutions designed, as per scope and project timelines, through the right set of internal/external partners

Posted 1 month ago

Apply

13.0 - 15.0 years

7 - 8 Lacs

Vijayawada

Work from Office

About Us SBI Card is proud to be an equal opportunity & inclusive employer and welcome employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect which makes it a promising place to work. What s in it for YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support mental and physical health of our employees Admirable work deserves to be rewarded. We have a well curated bouquet of rewards and recognition program for the employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through comprehensive learning & development framework Role Purpose Responsible for the management of all collections processes for allocated portfolio in the assigned CD/Area basis targets set for resolution, normalization, rollback/absolute recovery and ROR. 
Role Accountability:
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
- Ensure various critical segments as defined by the business are reviewed and performance is driven on them
- Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
- Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
- Ensure 100% data security using secured data transfer modes and data purging as per policy
- Ensure all customer complaints received are closed within the time frame
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocating
- Ensure agencies raise invoices on time
- Monitor NFTE ACR and CAPE as per the collection strategy
Measures of Success: Portfolio Coverage, Resolution Rate, Normalization/Rollback Rate, Settlement Waiver Rate, Absolute Recovery, Rupee Collected, NFTE CAPE, DRA Certification of NFTEs, Absolute Customer Complaints, Absolute Audit Observations, Process Adherence as per MOU
Technical Skills / Experience / Certifications: Credit card knowledge along with a good understanding of collection processes
Competencies critical to the role: Analytical Ability, Stakeholder Management, Problem Solving, Result Orientation, Process Orientation
Qualification: Post-Graduate / Graduate in any discipline
Preferred Industry: FSI

Posted 1 month ago

Apply

2.0 - 4.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Key Responsibilities: Design, develop, and maintain custom applications on the Zoho platform (Zoho Creator, Zoho CRM, Zoho Books, etc.). Automate business processes using Zoho Deluge scripting. Integrate Zoho apps with third-party applications using REST APIs. Customize Zoho CRM modules, workflows, and dashboards. Collaborate with cross-functional teams to gather requirements and deliver solutions. Troubleshoot and resolve application issues and bugs. Ensure data security and compliance with best practices. Required Skills: Proficiency in Zoho Creator, Zoho CRM, and other Zoho One applications. Experience with the Deluge scripting language. Strong understanding of APIs and webhooks. Familiarity with JavaScript, HTML, and CSS. Good problem-solving and analytical skills. Excellent communication and teamwork abilities. Preferred Qualifications: Bachelor's degree in Computer Science, IT, or related field. Experience with MySQL/PostgreSQL. Knowledge of the software development life cycle (SDLC) and agile methodologies.
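The REST integration work this role describes typically means sending OAuth-authenticated requests to Zoho's CRM API. A minimal Python sketch of building such a request (the `/crm/v2/<module>` path and `Zoho-oauthtoken` header follow Zoho's documented pattern, but treat the exact endpoint, scopes, and payload shape as assumptions to verify against the official API reference):

```python
import json
import urllib.request

def build_zoho_request(base_url, module, record, access_token):
    """Build an authenticated POST request to a Zoho-CRM-style REST endpoint.

    This only constructs the request object (no network call), so the
    authentication header and payload shape can be inspected and tested.
    """
    url = f"{base_url}/crm/v2/{module}"
    payload = json.dumps({"data": [record]}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={
            # Zoho's documented header format: "Zoho-oauthtoken <token>"
            "Authorization": f"Zoho-oauthtoken {access_token}",
            "Content-Type": "application/json",
        },
    )

# Illustrative values only -- "Leads" is a standard CRM module name,
# the token here is a placeholder.
req = build_zoho_request(
    "https://www.zohoapis.com", "Leads",
    {"Last_Name": "Doe", "Company": "Acme"}, "dummy-token",
)
```

In Deluge itself the same call would be a one-liner with `invokeurl`; the point of the sketch is the request shape, not the client library.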

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Vadodara

Work from Office

Leads IT security projects in terms of design, planning, and implementation of security infrastructure and solutions, including development and management of the overall enterprise security approach across infrastructure, network, data, cloud, and endpoint security. Analyse business requirements by partnering with key stakeholders across the organization to develop security solutions. Develop and review security-related documents such as SOPs, process documents, operational reports and metrics dashboards, etc. Hands-on experience with implementation of various security solutions, such as cloud security, data security, network security, and endpoint security solutions. Validate use cases and events configured on the SIEM in coordination with the SOC Manager. Develop and implement strategies for infrastructure and application hardening. Prepare plans and strategies to ensure the security of the organization, covering both high- and low-risk events. Identify and implement security best practices through fine-tuning of appliances and solutions and applying audit recommendations. Well-versed and experienced in the threat landscape, risk profiling, and continuous improvement of security processes. Work with IT service providers and partners to ensure an industry-standard platform, network, and endpoint security posture is maintained. Lead vulnerability management and penetration testing activities for infrastructure, improve them, and ensure closure as per established practices, along with analysing, reporting, and tracking all identified vulnerabilities. Work in collaboration with internal teams and other business units to identify and highlight security issues and ensure timely closure. Should be able to work under pressure and ensure that timelines are met and that projects and other initiatives are delivered within agreed timelines.
Leads the identification of data security and information protection risks across the organisation and works with stakeholders to develop and implement mitigation plans, escalating issues as appropriate. Helps to achieve the highest standards of information security across the organisation. Implements measures to protect digital files and information systems against unauthorized access, modification, or destruction. Develops strategies to respond to and recover from a security breach. Coordinates security activities with relevant vendors. Works alongside cross-functional teams and stakeholders, in conjunction with Cloud Development, Architecture, and DevOps teams, to provide visibility of the cloud security posture, including security of container and serverless environments. Day-to-day management, troubleshooting, and housekeeping of security toolsets. Delivering and maintaining security metrics and improvements. Should have experience presenting the overall information security status to the CISO, with all security metrics for defined KPIs. Planning and implementation of automated remediation activities. Ensuring work is completed in a way that complies with established compliance and other internal standards.

Posted 1 month ago

Apply

1.0 - 7.0 years

13 - 14 Lacs

Bengaluru

Work from Office

Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Job Overview: We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern frontend and backend technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience with Gen AI, AI, and machine learning technologies. Key Responsibilities: Develop and maintain web applications using Angular/React.js, .NET, and Python. Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vault, ADF, Databricks, and REST APIs with OpenAPI specifications. Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0. Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, scikit-learn, LangChain, LlamaIndex, and the Azure OpenAI SDK. Leverage agentic frameworks like Crew AI, AutoGen, etc. Well versed with RAG and agentic architecture. Strong in design patterns: architectural, data, object-oriented. Leverage Azure serverless components to build highly scalable and efficient solutions.
Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint. Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems. Primary Skills: Proficiency in React.js, .NET, and Python. Strong knowledge of Azure cloud services, including serverless architectures and data security. Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn. Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen. Familiarity with REST API design, Swagger documentation, and authentication best practices. Secondary Skills: Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration. Knowledge of Power BI for data visualization (preferred). Preferred Knowledge Areas (nice to have): In-depth understanding of machine learning, deep learning, and supervised/unsupervised algorithms. Mandatory skill sets: AI, ML. Preferred skill sets: AI, ML. Years of experience required: 3-7 years. Education qualification: BE/BTech, ME/MTech, MBA, MCA. Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering, Master of Engineering. Required Skills: Game AI, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more} Travel Requirements Government Clearance Required?
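The posting above asks for familiarity with RAG. As a stdlib-only toy illustrating just the retrieval step (production systems would use LangChain/LlamaIndex with learned vector embeddings; this sketch substitutes bag-of-words cosine similarity, and all names and documents here are invented for the example):

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the top-k documents most similar to the query.

    This is the 'R' in RAG: the retrieved text would then be stuffed
    into the LLM prompt as grounding context ('A' and 'G').
    """
    q = Counter(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return scored[:k]

docs = ["azure function apps deployment guide",
        "scikit-learn model training basics"]
top = retrieve("how to deploy a function app on azure", docs)
```

Embedding-based retrievers follow the same shape: embed query and documents, rank by similarity, pass the winners to the generator.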

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Chennai, Bengaluru

Work from Office

What awaits you / Job Profile: Provide estimates for requirements; analyse and develop as per the requirement. Developing and maintaining data pipelines and ETL (Extract, Transform, Load) processes to extract data efficiently and reliably from various sources, transform it into a usable format, and load it into the appropriate data repositories. Creating and maintaining logical and physical data models that align with the organization's data architecture and business needs; this includes defining data schemas, tables, relationships, and indexing strategies for optimal data retrieval and analysis. Collaborating with cross-functional teams and stakeholders to ensure data security, privacy, and compliance with regulations. Collaborating with downstream applications to understand their needs, and building and optimizing the data storage accordingly. Working closely with other stakeholders and the business to understand data requirements and translate them into technical solutions. Familiar with Agile methodologies, with prior experience working in Agile teams using Scrum/Kanban. Lead technical discussions with customers to find the best possible solutions. Proactively identify and implement opportunities to automate tasks and develop reusable frameworks. Optimizing data pipelines to improve performance and cost, while ensuring a high quality of data within the data lake. Monitoring services and jobs for cost and performance, ensuring continual operation of data pipelines, and fixing defects. Constantly looking for opportunities to optimize data pipelines to improve performance. What you should bring along - Must Have: Hands-on expertise of 6-8 years in AWS services like S3, Lambda, Glue, Athena, RDS, Step Functions, SNS, SQS, API Gateway, security, access and role permissions, and logging and monitoring services.
Good hands-on knowledge of Python, Spark, Hive, Unix, and the AWS CLI. Prior experience working with streaming solutions like Kafka. Prior experience implementing different file storage formats like Delta Lake / Iceberg. Excellent knowledge of data modeling and designing ETL pipelines. Must have strong knowledge of using different databases such as MySQL and Oracle, and writing complex queries. Strong experience working in a continuous integration and deployment process. Nice to Have: Hands-on experience with Terraform, Git, GitHub Actions, CI/CD pipelines, and Amazon Q. Must-have technical skills: PySpark, AWS, SQL, Kafka, Glue, IAM, S3, Lambda, Step Functions, Athena. Good-to-have technical skills: Terraform, Git, GitHub Actions, CI/CD pipelines, AI.
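The extract-transform-load flow this role centres on reduces to three small stages. A dependency-free Python sketch of the pattern (a real pipeline would be a Glue/PySpark job reading from S3 or Kafka and writing a Delta Lake table; the in-memory lists, field names, and quality rule here are illustrative assumptions):

```python
def extract(rows):
    """Extract: stand-in for reading from S3/Kafka; here, an in-memory source."""
    return list(rows)

def transform(rows):
    """Transform: normalize fields and drop records failing a basic quality gate."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # quality gate: reject incomplete records
            continue
        cleaned.append({"city": row["city"].strip().title(),
                        "amount": round(float(row["amount"]), 2)})
    return cleaned

def load(rows, sink):
    """Load: append to the target store (stand-in for a warehouse/table write).

    Returns the row count, the kind of metric a monitoring job would track.
    """
    sink.extend(rows)
    return len(rows)

sink = []
raw = [{"city": " bengaluru ", "amount": "10.5"},
       {"city": "pune", "amount": None}]       # incomplete -> dropped
loaded = load(transform(extract(raw)), sink)
```

The same separation of stages is what makes pipelines monitorable and reusable: each stage can be tested, retried, and cost-profiled independently.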

Posted 1 month ago

Apply

1.0 - 2.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Huron is redefining what a global consulting organization can be. Advancing new ideas every day to build even stronger clients, individuals and communities. We're helping our clients find new ways to drive growth, enhance business performance and sustain leadership in the markets they serve. And we're developing strategies and implementing solutions that enable the transformative change they need to own their future. As a member of the Huron corporate team, you'll help to evolve our business model to stay ahead of market forces, industry trends and client needs. Our accounting, finance, human resources, IT, legal, marketing and facilities management professionals work collaboratively to support Huron's collective strategies and enable real transformation to produce sustainable business results. Join our team and create your future. Huron's Corporate Workday team is comprised of business-minded technology professionals responsible for the ongoing optimization of our portfolio through product strategy, solution delivery, and support. Our team partners closely with our business stakeholders to identify challenges and opportunities to drive efficiencies and create real outcomes for our business. The product portfolios focus primarily on Workday but also contain integrations, bots, and other complementary solutions. We partner closely with our client-facing counterparts to share best practices and ensure Huron is at the cutting edge of Workday capabilities. The Workday HCM Core Developer will be primarily responsible for analysis, design, and configuration across the Core HCM, Benefits, and Compensation modules within the Workday platform. Requirements: Minimum of 1-2 years configuring and supporting Workday HCM modules such as Core HCM, Compensation, and Benefits. Experience with Workday Reporting, Calculated Fields, EIB builds, schema, and Excel data analysis.
Ability to translate business requirements into technical solutions and communicate effectively with stakeholders. Demonstrated ability to work with global HR teams and internal stakeholders to implement system enhancements. Strong analytical skills to troubleshoot and resolve system issues independently. Experience in creating and maintaining documentation for Workday business processes and technical specifications. Preferences: Workday HCM Certification is preferred; familiarity with non-HCM modules and Workday Security is a plus. Experience with Agile development processes, including PI Planning and Sprint Reviews. Experience working with global teams and understanding regional HR requirements. Ability to mentor and guide junior team members. Proactive in identifying and implementing system improvements. Strong understanding of data security and compliance standards. Position Level: Senior Analyst. Country: India

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office

Your work days are brighter here. At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday. About the Team: The Workday Success Plans team is all about our customers and their post-go-live journey - we create programs to help them drive business value from their Workday applications. The team is responsible for delivering a variety of programs and services to our customers, ranging from feature demonstrations to full feature deployments. At Workday, we help the world's largest organizations adapt to what's next by bringing finance, HR, and planning into a single enterprise cloud. We work hard, and we're serious about what we do. But we like to have fun, too! We put people first, celebrate diversity, drive innovation, and do good in the communities where we live and work. About the Role: Would you enjoy learning new things in a fast-paced environment? Do you have an appetite for variety and challenging business problems to solve? Are you a great communicator who can clearly articulate and demonstrate the value of Workday solutions to our customers? The Workday Success Plans team works directly with customers through targeted micro-consulting engagements to help solve their business needs using the Workday application. Responsible for acting as a trusted Workday advisor, you will have the opportunity to assist customers with how-to questions, troubleshoot and guide customers through configuration, and provide feature demonstrations.
As a Workday expert, customers will benefit from your knowledge as you share your experience, identify key considerations, and highlight standard methodologies. Our team of professionals have a broad and deep understanding of Workday, and enjoy the reward of helping customers solve problems, learn about new features, and find greater value in their Workday investment. So if you are passionate about the value technology can bring to an organization, love learning and want to work directly with some of the greatest companies on the planet, bring your energy and teamwork to the Workday Success Plans team! Primary responsibilities of this role include delivering various services to Workday Success Plans customers. To be successful, this requires: Conducting research to ensure understanding of customer questions and related Workday concepts. Delivering small scope consulting in response to customer requests; providing configuration guidance, demonstrations, considerations, tips & tricks. Troubleshooting product configuration to resolve or provide optimal product configuration to meet customer business requirements. Clearly and effectively communicating responses and value to customers. Creating and delivering customer presentations on how to use Workday features to achieve business goals (Accelerator Webinars). Providing one-on-one consulting guidance to accelerate customer feature adoption (Feature Accelerator). Reviewing customer tenants to identify adoption opportunities (Feature Adoption Tenant Reviews). Completing and maintaining product expertise and Workday Certification(s) along with familiarity with Workday roadmap. Keeping up to date with industry practices and the ability to engage with our customers on those topics. Helping drive the creation of new programs to drive customer feature adoption. 
In addition to delivering Workday Success Plans services, our Workday professionals will also deliver other Customer Enablement services, such as: Office Hours to conduct appointment-based consulting engagements providing guidance and product expertise to customers. Perform configuration and business requirements reviews with a detailed deliverable calling out opportunities for optimization. Provide one-on-one customer configuration designs. Deployments, including full deployment of Workday features. Ability to travel up to 10%. About You - Basic Qualifications: 5+ years of IT implementation experience. 5+ years of experience with HR and/or Finance systems, including but not limited to Workday, PeopleSoft, SAP, Oracle, and/or JD Edwards. 2+ years of software consulting experience. Other Qualifications: Workday Certification in HCM and at least one of the following is ideal: Compensation, Talent, Recruiting, Absence, Payroll. Ability to gain a thorough understanding of Workday concepts as new features are released. Ability to distill complex concepts into understandable presentations for our customers. Ability to multitask and work on multiple engagements and deliverables simultaneously. Strong critical thinking skills to understand complex technical process issues and facilitate/influence decision-making. Excellent verbal and written communication skills in English. Bachelor's degree or relevant work experience required; advanced degree preferred. Our Approach to Flexible Work: With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role).
This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Coimbatore

Work from Office

We are seeking a skilled Data Security Consultant with expertise in Data Loss Prevention (DLP) as the primary focus. The ideal candidate will also have experience with technologies such as Hardware Security Modules (HSM), Information Rights Management (IRM), Data Classification, and Public Key Infrastructure (PKI). Key Responsibilities: Develop and implement DLP strategies to protect sensitive data across various platforms. Administration of data security assets (Plan-Do-Check-Act cycle). Monitor and maintain existing jobs/tasks related to security solutions, including sync, backup, password management, and reporting. Oversee daily, weekly, monthly, and ad-hoc preconfigured notifications, reports, and dashboards. Perform health checks as scheduled; fine-tune and recommend fixes for any issues discovered during checks. Conduct periodic configuration reviews to ensure optimal performance and security. Undertake critical security patch management to keep data security solutions up to date and secure. Onboard new assets, accounts, and user policies into the data security solutions. Collaborate with OEM vendors for issue resolution and follow-up as needed. Assist Level 3 (L3) team members with their roles and responsibilities. Stay updated on industry trends and emerging threats related to data security. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of experience in data security technologies. Proven experience in DLP technologies and strategies. Strong understanding of HSM, IRM, Data Classification, and PKI. Excellent analytical and problem-solving skills. Relevant certifications (e.g., CISSP, CISM, CISA, product certifications) are a plus.

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Database Administrator. Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance. Must have skills: Oracle Database Administration (DBA). Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Database Administrator, you will administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Ensure data security and integrity
- Optimize database performance
- Implement backup and recovery procedures
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Oracle Database Administration (DBA)
- Strong understanding of database architecture
- Experience in database performance tuning
- Knowledge of database security best practices
- Hands-on experience with database backup and recovery procedures
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Oracle Database Administration (DBA).
- This position is based at our Pune office.
- A 15 years full time education is required.
Qualification: 15 years full time education
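Backup and recovery appear twice in the requirements above for good reason. As a language-neutral illustration of the online-backup idea (Oracle itself would use RMAN; this sketch uses Python's stdlib sqlite3 backup API purely to show copying a live database and verifying the copy is consistent, with invented table data):

```python
import sqlite3

# Source database with live data (in-memory for the demo).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT)")
src.execute("INSERT INTO accounts (owner) VALUES ('alice'), ('bob')")
src.commit()

# Online backup: pages are copied while the source connection stays open --
# the same high-level goal as an Oracle hot backup, though the mechanics
# (redo logs, archivelog mode, RMAN catalogs) differ enormously.
dst = sqlite3.connect(":memory:")
src.backup(dst)

# "Recovery" check: the backup must be a consistent, queryable copy.
rows = dst.execute("SELECT owner FROM accounts ORDER BY id").fetchall()
```

The verification step is the part that generalizes: a backup that is never restore-tested is not a backup procedure, whatever the engine.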

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: Engineering graduate, preferably Computer Science graduate; 15 years of full time education. Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines and ETL processes using Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls for the data platform.
- Troubleshoot and resolve issues related to the data platform and data pipelines.
Professional & Technical Skills:
- Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.
- Must To Have Skills: Strong understanding of data modeling and database design principles.
- Good To Have Skills: Experience with cloud-based data platforms such as AWS or Azure.
- Good To Have Skills: Experience with data security and access controls.
- Good To Have Skills: Experience with data visualization tools such as Tableau or Power BI.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai and Pune offices.
- Mandatory office (RTO) for 2-3 days, working 2 shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).
Qualification: Engineering graduate, preferably Computer Science graduate; 15 years of full time education

Posted 1 month ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Pune

Work from Office

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
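The "window functions" expertise this posting calls for can be shown compactly. A Python/sqlite3 sketch ranking each customer's orders with ROW_NUMBER() (the SQL itself runs essentially unchanged on Snowflake; the table and data are invented for the example, and sqlite3 is used only because it ships with Python):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 50.0), ("acme", 120.0), ("globex", 75.0)])

# Rank each customer's orders by amount, largest first: PARTITION BY
# restarts the numbering per customer, ORDER BY controls rank order.
rows = conn.execute("""
    SELECT customer, amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
```

Note that window functions need SQLite 3.25+ (bundled with Python 3.8 and later); swapping ROW_NUMBER() for RANK() or SUM(...) OVER (...) gives the other common interview variants.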

Posted 1 month ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Mumbai

Remote

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
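To illustrate the kind of SQL window-function work the listing calls for, here is a minimal sketch. The "orders" table, its columns, and the data are hypothetical, and Python's built-in sqlite3 stands in for a Snowflake warehouse; the OVER clause syntax shown is standard SQL.

```python
import sqlite3

# An in-memory SQLite database stands in for a warehouse;
# the "orders" table and its rows are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme',   '2024-01-05', 100.0),
        ('acme',   '2024-02-10', 250.0),
        ('globex', '2024-01-20',  75.0);
""")

# Window function: running total of spend per customer, ordered by date.
rows = conn.execute("""
    SELECT customer, order_date, amount,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)
```

The PARTITION BY / ORDER BY pair restarts the running sum for each customer, which is the typical shape of the "complex joins, window functions" work named above.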

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

As a Data Engineer at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Reality Labs, Threads). Your technical skills and analytical mindset will be applied to designing and building some of the world's most extensive data sets, helping to craft experiences for billions of people and hundreds of millions of businesses worldwide.

In this role, you will collaborate with software engineering, data science, and product management teams to design and build scalable data solutions across Meta to optimize growth, strategy, and user experience for our 3 billion-plus users, as well as our internal employee community. You will be at the forefront of identifying and solving some of the most interesting data challenges at a scale few companies can match. By joining Meta, you will become part of a world-class data engineering community dedicated to skill development and career growth in data engineering and beyond.

Data Engineering: You will guide teams by building optimal data artifacts (including datasets and visualizations) to address key questions. You will refine our systems, design logging solutions, and create scalable data models. Ensuring data security and quality, and with a focus on efficiency, you will suggest architecture and development approaches and data management standards to address complex analytical problems.

Product leadership: You will use data to shape product development, identify new opportunities, and tackle upcoming challenges. You'll ensure our products add value for users and businesses by prioritizing projects and driving innovative solutions to respond to challenges or opportunities.

Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.

Data Engineer, Product Analytics Responsibilities :
- Collaborate with engineers, product managers, and data scientists to understand data needs, representing key data insights in a meaningful way
- Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains
- Define and manage Service Level Agreements for all data sets in allocated areas of ownership
- Solve challenging data integration problems, utilizing optimal Extract, Transform, Load (ETL) patterns, frameworks, and query techniques, sourcing from structured and unstructured data sources
- Improve logging
- Assist in owning existing processes running in production, optimizing complex code through advanced algorithmic concepts
- Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts
- Influence product and cross-functional teams to identify data opportunities to drive impact

Minimum Qualifications :
- Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent
- 2+ years of experience where the primary responsibility involves working with data. This could include roles such as data analyst, data scientist, data engineer, or similar positions
- 2+ years of experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, or others)

Preferred Qualifications :
- Master's or Ph.D. degree in a STEM field

About Meta. Equal Employment Opportunity. Meta is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, fill out the Accommodations request form.
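The Extract, Transform, Load responsibilities above can be sketched in miniature. Everything here is illustrative (the record layout, the cleaning rules, and the list standing in for a target table), not any Meta-specific pipeline:

```python
# Minimal ETL sketch: extract yields raw records, transform applies
# data-quality rules, load appends to a sink. All names are hypothetical.

def extract(raw_rows):
    """Extract: yield source records as dicts (here, from an in-memory list)."""
    for row in raw_rows:
        yield {"user_id": row[0], "country": row[1], "spend": row[2]}

def transform(records):
    """Transform: drop malformed rows and normalize the country code."""
    for rec in records:
        if rec["spend"] is None or rec["spend"] < 0:
            continue  # data-quality filter: reject missing or negative spend
        rec["country"] = rec["country"].strip().upper()
        yield rec

def load(records, sink):
    """Load: append cleaned records to the sink (a list standing in for a table)."""
    for rec in records:
        sink.append(rec)

raw = [(1, "us ", 10.0), (2, "IN", -5.0), (3, "de", 7.5)]
table = []
load(transform(extract(raw)), table)
print(table)  # the negative-spend row is filtered out
```

Composing the three stages as generators keeps memory bounded, since each record flows through the pipeline one at a time; real ETL frameworks apply the same staging idea at much larger scale.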

Posted 1 month ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Nagpur

Remote

Employment Type : Contract (Remote).

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 1 month ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Ahmedabad

Remote

Employment Type : Contract (Remote).

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies