
1581 Data Security Jobs - Page 44

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 - 7.0 years

3 - 7 Lacs

Pune

Work from Office

Job Description

Job Purpose: ICE is the leading cloud-based platform provider for the mortgage finance industry. ICE solutions enable lenders to originate more loans, reduce origination costs, and shorten the time to close, all while ensuring the highest levels of compliance, quality, and efficiency. The Team Lead interfaces with the teams to understand solution functionality and technical requirements, and performs exception reviews on documents at different stages of the US mortgage process.

Responsibilities:
- Lead a team of 10-15 Data and Document Associates; oversee the team's activities and optimize performance on a daily basis.
- Act as the escalation point for the team and engage management to overcome obstacles to service delivery.
- Own team management, performance management, and production management processes.
- Mentor and assist team members in all aspects of their roles and job functions; train and mentor Document Associates.
- Identify mortgage documents and/or data for re-verification.
- Propose process improvements to enhance efficiency; understand concepts related to identifying and assessing risk.
- Ensure that data security is maintained.
- Oversee a team and areas of operations; perform complex work; develop processes, procedures, and policies.
- Plan and monitor the status of POCs, acceptance testing, pre-prod, onboarding testing, and smoke tests.
- Manage workflow inconsistencies between associates, teams, or shifts.
- Manage associate-level observation and data-capture projects, as well as training and development projects.

Knowledge and Experience:
- Bachelor's degree or academic equivalent.
- 5+ years of overall experience, with 2+ years of team handling/management experience in the mortgage lending or financial services market.
- Proficiency in mortgage document terminology.
- Ability to communicate relevant project information effectively with co-workers, peers, and management, both written and verbal.

Preferred:
- Proficiency with keyboard shortcuts; people management and team handling experience.
- Proficiency with Microsoft Office (Excel and Word) and Microsoft Windows.
- Strong attention to detail; excellent time management and organizational skills; ability to work efficiently under pressure and time constraints.

Posted 1 month ago

Apply

6.0 - 15.0 years

11 - 12 Lacs

Chennai

Work from Office

About Us: At SBI Card, the motto "Make Life Simple" inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal-opportunity, inclusive employer and welcomes employees without discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated with dignity and respect, which makes it a promising place to work.

What's in it for YOU:
- SBI Card truly lives by the work-life balance philosophy, with a robust wellness and wellbeing program to support employees' mental and physical health.
- Admirable work deserves to be rewarded: a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive, and diverse team culture; Gender Neutral Policy.
- Inclusive health benefits for all: medical insurance, personal accident cover, group term life insurance, annual health checkup, and dental and OPD benefits.
- Commitment to the overall development of employees through a comprehensive learning and development framework.

Role Purpose: Manage channel partners handling tele-calling responsibilities for the assigned collections portfolio.

Role Accountability:
- Execute the collection strategy for the site, track performance, and give inputs to the PM.
- Monitor channel partner performance through a structured review mechanism and ensure appropriate capacity planning.
- Review the portfolio both qualitatively and quantitatively at account/NRR/region/channel level to identify gaps and issues.
- Strategise and segment portfolio churning on the dialer, prioritizing various cuts (payment due date, occupation segment, billing cycle, geography, CM/NM, new product variants) to improve efficiency and performance.
- Conduct call sampling across portfolio segments in line with the call-listening framework and share observations on call and portfolio quality with the concerned stakeholders.
- Manage uptime for all SBI Card collection systems at the vendor site to ensure uninterrupted production, and coordinate with internal stakeholders to ensure business continuity in the event of downtime of vendor CRM, dialer systems, or telecom resources.
- Identify portfolio segments for legal approaches such as mediation, conciliation, Lok Adalat, and arbitration to extract on identified accounts, and attend all camps as required.
- Ensure field-referral rates are in line with business targets by identifying the right accounts for tele-calling retention as well as field referral, adopting a segmented approach based on past delivery trends.
- Analyze action codes daily and propose action plans to improve performance and the right identification of field referrals.
- Ensure necessary training and certifications for tele-calling staff in line with compliance requirements.
- Conduct spot audits to ensure adherence to regulatory and internal data security guidelines in all collection operations at channel partner sites.
- Track all agents' performance on key metrics daily to identify adverse trends in performance, call quality, or discipline (e.g., frequent late logins, uninformed/unplanned leaves, low TOS, non-adherence to compliance guidelines) and take suitable remediation measures.
- Manage agency payouts in line with the business SLA model and ensure billing within defined timelines; track accuracy across the components of agency payouts (actual headcount deployed, PRI lines, contests, team engagement, etc.).

Measures of Success: Resolution rate, normalisation rate, roll-back rate, KP targets, PLI penetration, money collected, NFTE productivity, tele retention rate, NFTE training coverage, customer complaints volume, vendor SLA adherence, no adverse observations in internal/external audits, process adherence as per MOU.

Technical Skills / Experience / Certifications: Credit card knowledge along with a good understanding of collection processes; knowledge of dialer strategies; experience managing large, distributed vendor teams.

Competencies critical to the role: Stakeholder management, result orientation, process orientation, problem-solving skills.

Qualification: Post-Graduate/Graduate in any discipline. Preferred Industry: Credit Card.

Posted 1 month ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

JD: Senior Snowflake Data Architect: Designs, implements, and optimizes data solutions within the Snowflake cloud data platform, ensuring data security, governance, and performance, while collaborating with cross-functional teams and providing technical leadership. The Data Architect's responsibilities include determining a data strategy, understanding data management technologies, overseeing the data inventory, and keeping a finger on the pulse of the organization's data management systems.

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Responsibilities:
- Write complex SQL queries for data extraction and perform exploratory data analysis (EDA) to uncover insights.
- Use Python and PySpark for scalable data processing and analytics.
- Create, transform, and optimize features to enhance model performance.
- Train, evaluate, and maintain machine learning models in production.
- Write efficient, maintainable, version-controlled code that handles large datasets.
- Regularly update internal teams and clients on project progress, results, and insights.
- Conduct hypothesis testing and experiment analysis (A/B testing) to drive data-driven decisions, as in the sketch below.
- Scale machine learning algorithms to work on massive datasets under strict SLAs.
- Automate operations pipelines that run at regular intervals to update required datasets.

What you'll bring:
- A master's or bachelor's degree in computer science or a related field from a top university.
- 4+ years of hands-on experience in machine learning or data science, with a focus on building scalable solutions.
- Strong programming expertise in Python and PySpark (a must).
- Proven ability to write highly optimized SQL queries for efficient data extraction and transformation.
- Experience in feature engineering, inferencing pipelines, and real-time model prediction deployment.
- Strong fundamentals in applied statistics, with expertise in A/B test design and hypothesis testing.
- Solid understanding of distributed computing systems and hands-on experience with at least one cloud platform (GCP, AWS, or Azure).

Additional skills: Understanding of Git, DevOps, CI/CD, and data security; experience designing on a cloud platform; experience automating operations with a job scheduler such as Airflow; data engineering experience in Big Data systems.
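For context, a minimal sketch of the SQL-extraction plus A/B-test analysis workflow this posting describes, using PySpark and SciPy. The table, column names, and experiment ID are hypothetical placeholders, not part of the posting.

```python
# Hedged sketch: extract experiment data with SQL, aggregate per variant,
# and run a significance test. Table/columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from scipy import stats

spark = SparkSession.builder.appName("ab-test-analysis").getOrCreate()

# Extraction: complex SQL pushed down to the warehouse layer.
events = spark.sql("""
    SELECT user_id, variant, converted
    FROM analytics.experiment_events   -- hypothetical table
    WHERE experiment_id = 'exp_42'
""")

# Aggregate conversions per variant (assumes exactly two variants, A and B).
summary = (events.groupBy("variant")
                 .agg(F.count("*").alias("n"),
                      F.sum("converted").alias("conversions"))
                 .orderBy("variant")
                 .collect())
a, b = summary

# Two-variant significance test via a chi-square contingency test.
table = [[a["conversions"], a["n"] - a["conversions"]],
         [b["conversions"], b["n"] - b["conversions"]]]
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p-value = {p_value:.4f}")
```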

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

The Provisioner Team is responsible for the design, development, release, and operation of Provisioner, a critical component of our foundational technologies. Provisioner acts as the single source of truth for all user data across all Netskope apps, scales to hundreds of millions of devices at any given time, and processes billions of requests daily. If you are passionate about solving complex problems and developing cloud services that are reliable, performant, and scalable, we would like to speak with you.

What's in it for you: As a member of the Provisioner team you will play a key role in the design, development, and ongoing evolution of a critical component of our foundational technologies. You will be responsible for full-lifecycle software development, including requirements analysis, technical architecture, design, implementation, testing, documentation, the recipe for deployment to production, and post-production ownership.

What you will be doing:
- Design and develop cloud systems and services that handle billions of events.
- Coordinate with other service development teams, product management, and support teams to ensure scalability, supportability, and availability for owned services and dependent services.
- Work on customer issues in a timely manner to improve issue-resolution response time and customer satisfaction.
- Evaluate open source technologies to find the best fit for our needs, and contribute to some of them to meet our unique needs and help the community.

Required skills and experience:
- 8+ years of experience in software development.
- Excellent programming experience in Python, Node.js, or TypeScript, using the right data structures and algorithms.
- Well versed in the design and development of complex, large-scale distributed systems using technologies such as Kafka, Redis, MongoDB, MySQL, etc.
- Experience developing applications with RESTful APIs, including DB design and management.
- Experience in scaling and performance optimization of systems, including DB query tuning and optimization.
- Designed and developed cloud microservices that are deployed and used at high scale.
- Energetic self-starter with the desire to work in a dynamic, fast-paced environment.
- Excellent verbal and written communication skills.
- Knowledge of directory services and identity management solutions is a plus.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred.

Posted 1 month ago

Apply

10.0 - 15.0 years

13 - 18 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

As a Sr. Staff Engineer on the Data Engineering team, you'll work on some of the hardest problems in the field of data, cloud, and security, with a mission to achieve the highest standards of customer success. You will build blocks of technology that will define Netskope's future, leveraging open source technologies around OLAP, OLTP, streaming, Big Data, and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics; your contributions will have a major impact on our global customer base and across the industry through our market-leading products, and you will solve complex, interesting challenges while improving the depth and breadth of your technical and business skills.

What you will be doing:
- Conceive and build services used by Netskope products to validate, transform, load, and analyze large amounts of data using distributed systems with cloud scale and reliability (see the sketch below).
- Help other teams architect their applications using services from the Data team while applying best practices and sound designs.
- Evaluate many open source technologies to find the best fit for our needs, and contribute to some of them.
- Work with the Application Development and Product Management teams to scale their underlying services.
- Provide easy-to-use analytics of usage patterns, anticipate capacity issues, and help with long-term planning.
- Learn about and design large-scale, reliable enterprise services.
- Work with great people in a fun, collaborative environment.
- Create scalable data mining and data analytics frameworks using cutting-edge tools and techniques.

Required skills and experience:
- 10+ years of industry experience building highly scalable distributed data systems.
- Programming experience in Python, Java, or Golang; excellent data structure and algorithm skills.
- Proven good development practices, such as automated testing and measuring code coverage.
- Proven experience developing complex data platforms and solutions using technologies like Kafka, Kubernetes, MySQL, Hadoop, BigQuery, and other open source databases.
- Experience designing and implementing large, fault-tolerant, distributed systems around columnar data stores.
- Excellent written and verbal communication skills.
- Bonus points for contributions to the open source community.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred.
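As an illustration of the validate/transform/load pattern over Kafka named above, here is a minimal sketch assuming the open-source kafka-python client; the broker address and topic names are hypothetical placeholders.

```python
# Hedged sketch: consume raw events, validate and normalize them, and
# republish to a clean topic for downstream loading. Broker/topics are
# hypothetical.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-events",                                    # hypothetical topic
    bootstrap_servers="kafka.example.internal:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    group_id="enrichment-service",
)
producer = KafkaProducer(
    bootstrap_servers="kafka.example.internal:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Validate: drop malformed events rather than poisoning downstream stores.
    if "user_id" not in event or "action" not in event:
        continue
    # Transform: normalize fields before loading into the analytics topic.
    event["action"] = event["action"].lower()
    producer.send("clean-events", event)             # hypothetical sink topic
```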

Posted 1 month ago

Apply

10.0 - 15.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Responsibilities:
- Develop, execute, implement, and check methods, plans, toolsets, and approaches that are appropriate and compliant for achieving the intended digital solutions.
- Identify technical problems and apply/integrate solutions as needed, using a set-based design approach.
- Analyse customer requirements and define technical solutions as input for proposals to develop proofs of concept.
- Work with cross-functional teams and multiple sites during the development process.
- Generate assigned project deliverables, documents, and reports according to project milestones.
- Support and participate in technical reviews, including creating and preparing technical data and presentations as needed, and support other engineers with peer-to-peer reviews.
- Support Lean culture and improvement initiatives in the organisation; support idea generation and CI activities.
- Take part in regular sprint planning meetings to plan, review, and deliver outputs based on agile philosophy.
- Support the creation of training material and knowledge sharing in the relevant area of work.
- Perform all activities independently and help other engineers within the program as required.

Requirements:
- Bachelor's in Engineering or higher, with a minimum of 10 years of relevant experience in automation and software development.
- Design data pipelines to handle large-scale data for training, ensuring data security and compliance with aerospace and defence standards.
- Excellent experience in shop-floor automation and I4.0/IoT integration.
- Excellent understanding of industrial communication protocols and establishing communication between different industrial systems.
- Good knowledge of data structures, data modelling, and database architecture; good knowledge of implementing business processes in functional code.
- Excellent knowledge of software coding and integrated development platforms; proficiency in programming with Python, C++, C, C#, Java, .NET, VB, and SQL, and working knowledge of Git.
- Ability to conduct POCs and guide team members to extract valuable insights and drive data-driven decision-making; ability to evaluate and select appropriate tools and applications for tasks.
- Strong software development skills, including version control (e.g., Git), debugging, testing, and documentation; familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes) is beneficial.
- Stay updated with the latest developments in automation, software development, NLP, ML, AI, and LLM technology relevant to the aerospace and defence sector, and work with the team to quickly leverage, test, and validate new solutions applicable to working projects by applying cutting-edge technologies.
- Strong problem-solving skills and the ability to collaborate with cross-functional teams, including domain experts, data scientists, and engineering teams, to gather requirements and translate them into scalable applications.

Posted 1 month ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Gurugram

Work from Office

Job Overview: We are looking for experienced Data Engineers proficient in Hadoop, Hive, Python, SQL, and PySpark/Spark to join our dynamic team. Candidates will be responsible for designing, developing, and maintaining scalable big data solutions.

Key Responsibilities:
- Develop and optimize data pipelines for large-scale data processing (see the sketch below).
- Work with structured and unstructured datasets to derive actionable insights.
- Collaborate with cross-functional teams to enhance data-driven decision-making.
- Ensure the performance, scalability, and reliability of data architectures.
- Implement best practices for data security and governance.
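A minimal sketch of the kind of Hive-backed PySpark batch pipeline this posting describes; the database, table, and column names are hypothetical placeholders.

```python
# Hedged sketch: read a Hive table, clean/validate it, and write a
# partitioned curated table. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("daily-orders-pipeline")
         .enableHiveSupport()          # lets spark.table() resolve Hive tables
         .getOrCreate())

raw = spark.table("raw_db.orders")     # hypothetical Hive source

clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("amount") > 0)                       # basic validation
         .withColumn("order_date", F.to_date("created_at")))

# Partition by date so downstream jobs can prune efficiently.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("curated_db.orders_daily"))
```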

Posted 1 month ago

Apply

1.0 - 13.0 years

13 - 14 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, and maintain scalable data solutions using Starburst.
- Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools (see the sketch below).
- Optimize query performance and ensure data security and compliance.
- Implement monitoring and alerting systems for data platform health.
- Stay updated with the latest developments in data engineering and analytics.

Skills - Must have:
- Bachelor's or Master's degree in a related technical field, or equivalent professional experience.
- Prior experience as a Software Engineer applying new engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of Big Data languages, including SQL, Hive, Spark/PySpark, Presto, and Python.
- Strong knowledge of Big Data platforms, such as the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst.
- Good knowledge of and experience with cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies, and to select among available technologies to implement and solve for the need.
- Able to understand and design moderately complex systems; understanding of testing and monitoring tools; ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have:
- Data Architecture & Engineering: design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business Intelligence & Data Visualization: create insightful Power BI dashboards to help drive business decisions.

Other Languages: English C1 Advanced. Seniority: Senior. Req. VR-114886.
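A minimal sketch of querying a Starburst/Trino cluster from Python, assuming the open-source trino client package; the host, credentials, catalogs, and tables are hypothetical placeholders.

```python
# Hedged sketch: connect to a (hypothetical) Starburst coordinator and run a
# federated query joining two catalogs, which is Trino's core strength.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.internal",   # hypothetical coordinator
    port=443,
    user="etl_service",
    http_scheme="https",
    catalog="hive",
    schema="analytics",
)
cur = conn.cursor()

# Federated join across catalogs (e.g., Hive + PostgreSQL).
cur.execute("""
    SELECT c.region, count(*) AS orders
    FROM hive.analytics.orders o
    JOIN postgresql.public.customers c ON o.customer_id = c.id
    GROUP BY c.region
""")
for region, orders in cur.fetchall():
    print(region, orders)
```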

Posted 1 month ago

Apply

12.0 - 17.0 years

32 - 37 Lacs

Bengaluru

Work from Office

Role: Snowflake Architect

Job Summary: As a Data Architect, you are core to the D&AI (Data & AI) Practice's success. Data is foundational to everything we do, and you are accountable for defining and delivering best-in-class Snowflake data management solutions across all major cloud platforms. This is a senior role with high visibility, reporting to the D&AI Practice Tower Lead.

Job Responsibilities:
- Architectural Design: Architect secure, scalable, highly performant data engineering and management solutions, including data warehouses, data lakes, ELT/ETL, and real-time data engineering/pipeline solutions. Support the Principal Data Architect in defining and maintaining the Practice's reference data engineering and data management architectures.
- Snowflake Implementation: Design and manage scalable end-to-end data solutions leveraging native Snowflake workloads, including Data Engineering, Data Lake, Data Warehouse, Applications, Unistore, AI/ML, Governed Collaboration, Marketplace, and Streamlit (see the sketch below).
- Hyperscaler Design: Competently leverage data-related cloud platform (AWS or Azure) capabilities to architect and develop end-to-end data engineering and data management solutions.
- Client Engagement: Collaborate and partner regularly with clients to understand their challenges and needs, translate requirements into data solutions that drive customer value, and support proposal development.
- Data Modeling: Create and maintain conceptual, logical, and physical data models that support both transactional and analytical needs, optimized for performance and scalability.
- Creativity: Be an out-of-the-box thinker, passionate about applying your skills to new and existing solutions alike while always demonstrating a customer-first mentality.

Mandatory Skills:
- 12+ years of hands-on data solution architecture and implementation experience on modern cloud platforms (AWS preferred), including microservice and event-driven architectures.
- Snowflake SnowPro Advanced Architect certification, plus an architectural certification on AWS, Azure, or GCP.
- Hands-on experience with Snowflake capabilities including Snowpipe, Snowpark, Cortex, Polaris Catalog, native applications, Notebooks, Horizon, Marketplace, and Streamlit.
- Practical experience with end-to-end data engineering and data management supporting functions, including data modeling (conceptual, logical, and physical), BI and analytics, data governance, data quality, data security/privacy/compliance, IAM, and performance optimization.
- Advanced SQL and data profiling; Python, Scala, or Java.
- Strong communication skills, with the ability to convey technical concepts to non-technical users; strong self-management skills, with the ability to multitask and self-manage goals and activities.

Additional / Nice-to-have Qualifications: Snowflake SnowPro Advanced Data Engineer, Data Scientist, Administrator, or Data Analyst certifications.

Required Education: Master's or Bachelor's (CS, IT, Applied Mathematics, or demonstrated experience).
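A minimal Snowpark sketch of the Data Engineering workload named above, assuming the snowflake-snowpark-python package; the connection parameters and table names are hypothetical placeholders.

```python
# Hedged sketch: Snowpark DataFrame operations compile to SQL and run inside
# Snowflake's engine. Account/credentials/tables are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account":   "myorg-myaccount",    # hypothetical
    "user":      "ARCHITECT_SVC",
    "password":  "...",                # in practice, key-pair auth / a secrets store
    "warehouse": "ANALYTICS_WH",
    "database":  "SALES",
    "schema":    "PUBLIC",
}).create()

revenue = (session.table("ORDERS")
                  .filter(col("STATUS") == "SHIPPED")
                  .group_by("REGION")
                  .agg(sum_("AMOUNT").alias("REVENUE")))

# Persist the aggregate as a table for downstream BI workloads.
revenue.write.save_as_table("REVENUE_BY_REGION", mode="overwrite")
```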

Posted 1 month ago

Apply

3.0 - 8.0 years

50 - 55 Lacs

Bengaluru

Work from Office

Amazon strives to be the world's most customer-centric company, where customers can research and purchase anything they might want online. We set big goals and are looking for people who can help us reach and exceed them. The CPT Data Engineering & Analytics (DEA) team builds and maintains critical data infrastructure that enhances seller experience and protects the privacy of Amazon business partners throughout their lifecycle. We are looking for a strong Data Engineer to join our team. The Data Engineer I will work with well-defined requirements to develop and maintain data pipelines that help internal teams gather required insights for business decisions in a timely and accurate manner. You will collaborate with a team of Data Scientists, Business Analysts, and other Engineers to build solutions that reduce investigation defects and assess the health of our Operations business, while ensuring data quality and regulatory compliance. The ideal candidate must be passionate about building reliable data infrastructure, detail-oriented, and driven to help protect Amazon's customers and business partners. They will be an individual contributor who works effectively with guidance from senior team members to successfully implement data solutions, and must be proficient in SQL and at least one scripting language (e.g., Python, Perl, Scala), with a strong understanding of data management fundamentals and distributed systems concepts.

Responsibilities:
- Build and optimize physical data models and data pipelines for simple datasets.
- Write secure, stable, testable, maintainable code with minimal defects.
- Troubleshoot existing datasets and maintain data quality.
- Participate in team design, scoping, and prioritization discussions.
- Document solutions to ensure ease of use and maintainability.
- Handle data in accordance with Amazon policies and security requirements.

Qualifications:
- Master's degree in computer science, engineering, analytics, mathematics, statistics, IT, or equivalent.
- 3+ years of data engineering experience; experience with SQL, data modeling, warehousing, and building ETL pipelines.
- Knowledge of distributed systems concepts from a data storage and compute perspective.
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions (see the sketch below).
- Familiarity with big data technologies (Hadoop, Spark, etc.); knowledge of data security and privacy best practices.
- Strong problem-solving and analytical skills; excellent written and verbal communication skills; ability to work effectively in a team environment.
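A minimal boto3 sketch of orchestrating the kind of AWS Glue pipeline work listed above; the job name, region, and arguments are hypothetical placeholders.

```python
# Hedged sketch: start a (hypothetical) Glue ETL job run and check its state.
# In production, a scheduler or Step Functions state machine would poll this.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Kick off a Glue ETL job, parameterized by the partition to process.
run = glue.start_job_run(
    JobName="orders-daily-etl",                        # hypothetical job
    Arguments={"--process_date": "2024-01-15"},
)

# Inspect the run state (e.g., RUNNING, SUCCEEDED, FAILED).
status = glue.get_job_run(JobName="orders-daily-etl", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])
```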

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

Platform Engineer - Bangalore, Karnataka, India

AXA XL is creating a new delivery model based on agile and a new vendor model to enable more efficient delivery of technology, with the business fully embedded in what we deliver. This Digital Factory will first deliver new capabilities for Fusion Transformation and will then be rolled out to other capabilities within GT (Global Technology). As a Platform Engineer, your role is crucial in maintaining and optimizing the reliability, availability, and performance of our Salesforce platform. You'll collaborate with cross-functional teams to ensure a seamless experience for users and contribute to the overall success of our business processes.

What you'll be DOING - What will your essential responsibilities include?
- Customize and configure the Salesforce platform to enhance its reliability, incorporating features like custom objects, fields, workflows, and validation rules.
- Manage user accounts, profiles, and permission sets, ensuring optimal access levels and security settings.
- Import, export, and maintain data within Salesforce, conducting cleanup and deduplication when necessary (see the sketch below).
- Salesforce administration, including user account maintenance, security settings, and workflow rules; application support, including triaging incidents and assisting support teams in resolving issues.
- Optimize the Salesforce platform for performance, security, and audit; experience implementing Salesforce AI capabilities is an added advantage.
- Hands-on implementation of integrations with various systems; support for SSO, certificates, backup and recovery, and key encryption issues.
- Monitor transactions and performance with Dynatrace, including event monitoring, notifications, and error logging; support pen-testing requirements.
- Onboard all privileged users to CyberArk and non-privileged users to Aveksa for regular user access management; rotate service/integration account credentials every 60 days and store them in a CyberArk Safe; conduct a proof of concept on using Azure Key Vault as an alternative to CyberArk to automate credential rotation.
- Perform non-release-dependent changes based on requirements and approvals; implement findings of Salesforce Optimizer to enhance platform performance and security.
- Implement and maintain automated deployment pipelines for Salesforce applications using CI/CD tools, ensuring smooth and efficient deployment processes across Salesforce environments.
- Develop and maintain automation using Salesforce Process Builder, Workflow Rules, and Flow to streamline business processes.
- Create and maintain reports and dashboards to offer insights into sales, marketing, and customer service performance.
- Collaborate with different departments to understand their requirements and provide Salesforce solutions; provide user support, answer questions, and offer training to ensure effective platform utilization.
- Work with the IT team to integrate Salesforce with other business systems and applications; maintain data security and compliance with relevant regulations and company policies.
- Keep detailed records of configuration changes, customizations, and user guides for the Salesforce platform.

Must have: Salesforce Admin/Developer background, Dynatrace, Commvault, Shield Encryption, Azure Key Vault, SSO, CyberArk. You will report to the Platform Lead.

What you will BRING - Required Skills and Abilities:
- Excellent understanding of Salesforce, including experience with Sales Cloud, Service Cloud, or other Salesforce products; Salesforce Administrator certification is often required or highly recommended.
- Proficiency in using Salesforce tools and features, with familiarity in data management, database concepts, and reporting; in-depth knowledge of Salesforce architecture, components, and deployment mechanisms; understanding of Salesforce security features and best practices.
- Ability to analyze complex business processes and design solutions within Salesforce.
- Excellent communication skills for collaboration, understanding requirements, and providing effective training and support; project management skills for handling Salesforce implementations, upgrades, and integrations.
- Attention to detail to maintain data accuracy and system integrity; ability to troubleshoot and resolve technical issues, collaborating with Salesforce support when necessary.
- Collaborative attitude to work with different teams and departments; stays up to date with new Salesforce features and best practices.

Desired Skills and Abilities:
- Adaptability to new/different strategies, programs, technologies, practices, and cultures; comfortable with change and able to make transitions easily.
- Effective communication skills, both verbal and written; proven ability to clearly articulate goals and desired outcomes and influence key decisions to ensure deliverables are met.
- Proven ability to establish and maintain effective relationships and leverage those relationships to deliver on goals; ability to effectively integrate colleagues and teams that are currently disparate, introducing new technologies and processes.
- Proven planning and organization skills: creating work schedules, prioritizing workload, preparing in advance, and setting realistic timescales.
- Bachelor's degree or equivalent work experience.
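A minimal sketch of the programmatic Salesforce data work described above (querying records and updating a field), assuming the open-source simple-salesforce package; credentials, record IDs, and field values are hypothetical placeholders.

```python
# Hedged sketch: SOQL query plus a single-record update, the kind of
# data-hygiene pass an admin/engineer might script. All values hypothetical.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@example.com",      # hypothetical credentials
    password="...",
    security_token="...",
)

# SOQL query: find contacts flagged for cleanup/deduplication review.
result = sf.query("SELECT Id, Email FROM Contact WHERE Email = null LIMIT 10")
for record in result["records"]:
    print(record["Id"])

# Field update on a single record as part of the cleanup pass.
sf.Contact.update("0031x00000AbCdEfGH", {"Description": "Flagged for review"})
```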

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Gurugram

Work from Office

Senior Specialist, BA/DA - Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained dynamic advantage. Our Innovation, Data & Analytics function is focused on driving innovation by optimizing how we leverage data to drive strategy and differentiate ourselves from the competition. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on data and data-driven insights, we are seeking a Senior Specialist for our Data Sourcing & Solutions team. The role sits across the Innovation, Data & Analytics Department to ensure customer requirements are properly captured and transformed into actionable data specifications. Success in the role requires proactive management of the sourcing and management of data from source through usage.

What you'll be DOING - What will your essential responsibilities include?
- Identify, evaluate, and acquire data sources that align with customer needs, collaborating with business stakeholders, third-party vendors, and source system teams.
- Design and implement data integration strategies to combine diverse datasets from internal and external sources, including accountability for documenting data requirements for ETL processes, APIs, and data pipelines.
- Develop data solutions to address specific business challenges, such as custom data models that provide actionable insights and integrate with existing data assets.
- Oversee the organization and management of data within databases, ensuring data security, integrity, and accessibility.
- Work in an Agile framework: define and prioritize the product backlog and collaborate with agile teams to deliver business goals and customer needs.
- Work closely with cross-functional teams, including Data Engineering, Data Science, Data Management, Data Governance, Data Quality, BI Solutions, and stakeholders.
- Implement measures to maintain data accuracy, consistency, and completeness; perform data validation and cleansing as needed.
- Adhere to data governance standards, ensuring compliance with regulations and internal policies related to data usage and privacy.
- Use data technologies such as SQL, Azure cloud technologies, and Databricks to analyze and produce data insights; stay updated with emerging data management technologies.
- Develop expertise in the insurance domain to better understand the context of the data.
- Identify data-related issues, troubleshoot problems, and recommend solutions to enhance data sourcing and integration processes.
- Provide guidance and mentorship to junior analysts and team members, fostering a culture of continuous learning and improvement.
- Translate complex technical concepts into understandable insights for non-technical stakeholders to drive data-informed decision-making.
- Explore innovative approaches to data acquisition, integration, and solution development that improve efficiency and effectiveness.
- Help ensure the organization's data ecosystem is robust, well integrated, and capable of providing accurate and actionable insights that support a data-driven organization.
- Instill a customer-first attitude, prioritizing service for our business stakeholders above all else.

You will report to the Lead Specialist, Data Sourcing & Solutions.

What you will BRING - Required Skills and Abilities:
- Extensive experience in a data role (business analyst, data analyst, analytics), preferably in the insurance industry and within a data division.
- Excellent presentation, communication (oral and written), and relationship-building skills across all levels of management and customer interaction.
- Excellent SQL knowledge, exposure to Azure cloud technologies (including Databricks), and the technical ability to query AXA XL data sources to understand our data.
- Deep insurance experience in data, underwriting, claims, and/or operations, including influencing, collaborating, and leading efforts in complex, disparate, and inter-related teams with competing priorities.
- Passion for data and experience working within a data-driven organization; ability to integrate internal data with external industry data to deliver holistic solutions, and to work with unstructured data to unlock information the business needs to create unique products for the insurance industry.
- Excellent exploratory analysis skills and high intellectual curiosity; exceptional organizational skills and attention to detail; effective conceptual thinking, critical thinking, and analytical skills.
- Ability to take ownership, work under pressure, and meet deadlines; ability to work with team members across the globe and across departments.

Desired Skills and Abilities:
- Builds trust and rapport within and across groups.
- Applies in-depth knowledge of business and specialized areas to solve business problems creatively and strategically, and understands integration challenges and long-term impact.
- Ability to manage the data needs of individual projects while understanding the broader enterprise data perspective.
- Expected to recommend innovation and improvements to policies, procedures, resource deployment, and core activities.

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

Senior Platform Engineer - Bangalore, Karnataka, India

AXA XL is creating a new delivery model based on agile and a new vendor model to enable more efficient delivery of technology, with the business fully embedded in what we deliver. This Digital Factory will first deliver new capabilities for Fusion Transformation and will then be rolled out to other capabilities within GT (Global Technology). As a Platform Engineer, your role is crucial in maintaining and optimizing the reliability, availability, and performance of our Salesforce platform. You'll collaborate with cross-functional teams to ensure a seamless experience for users and contribute to the overall success of our business processes.

What you'll be DOING - What will your essential responsibilities include?
- Customize and configure the Salesforce platform to enhance its reliability, incorporating features like custom objects, fields, workflows, and validation rules.
- Manage user accounts, profiles, and permission sets, ensuring optimal access levels and security settings.
- Import, export, and maintain data within Salesforce, conducting cleanup and deduplication when necessary.
- Salesforce administration, including user account maintenance, security settings, and workflow rules; application support, including triaging incidents and assisting support teams in resolving issues.
- Optimize the Salesforce platform for performance, security, and audit; experience implementing Salesforce AI capabilities is an added advantage.
- Hands-on implementation of integrations with various systems; support for SSO, certificates, backup and recovery, and key encryption issues.
- Monitor transactions and performance with Dynatrace, including event monitoring, notifications, and error logging; support pen-testing requirements.
- Onboard all privileged users to CyberArk and non-privileged users to Aveksa for regular user access management; rotate service/integration account credentials every 60 days and store them in a CyberArk Safe; conduct a proof of concept on using Azure Key Vault as an alternative to CyberArk to automate credential rotation.
- Perform non-release-dependent changes based on requirements and approvals; implement findings of Salesforce Optimizer to enhance platform performance and security.
- Implement and maintain automated deployment pipelines for Salesforce applications using CI/CD tools, ensuring smooth and efficient deployment processes across Salesforce environments.
- Develop and maintain automation using Salesforce Process Builder, Workflow Rules, and Flow to streamline business processes.
- Create and maintain reports and dashboards to offer insights into sales, marketing, and customer service performance.
- Collaborate with different departments to understand their requirements and provide Salesforce solutions; provide user support, answer questions, and offer training to ensure effective platform utilization.
- Work with the IT team to integrate Salesforce with other business systems and applications; maintain data security and compliance with relevant regulations and company policies.
- Keep detailed records of configuration changes, customizations, and user guides for the Salesforce platform.

Must have: Salesforce Admin/Developer background, Dynatrace, Commvault, Shield Encryption, Azure Key Vault, SSO, CyberArk. You will report to the Platform Lead.

What you will BRING - Required Skills and Abilities:
- Excellent understanding of Salesforce, including experience with Sales Cloud, Service Cloud, or other Salesforce products; Salesforce Administrator certification is often required or highly recommended.
- Proficiency in using Salesforce tools and features, with familiarity in data management, database concepts, and reporting; in-depth knowledge of Salesforce architecture, components, and deployment mechanisms; understanding of Salesforce security features and best practices.
- Ability to analyze complex business processes and design solutions within Salesforce.
- Excellent communication skills for collaboration, understanding requirements, and providing effective training and support; project management skills for handling Salesforce implementations, upgrades, and integrations.
- Attention to detail to maintain data accuracy and system integrity; ability to troubleshoot and resolve technical issues, collaborating with Salesforce support when necessary.
- Collaborative attitude to work with different teams and departments; stays up to date with new Salesforce features and best practices.

Desired Skills and Abilities:
- Adaptability to new/different strategies, programs, technologies, practices, and cultures; comfortable with change and able to make transitions easily.
- Effective communication skills, both verbal and written; proven ability to clearly articulate goals and desired outcomes and influence key decisions to ensure deliverables are met.
- Proven ability to establish and maintain effective relationships and leverage those relationships to deliver on goals; ability to effectively integrate colleagues and teams that are currently disparate, introducing new technologies and processes.
- Proven planning and organization skills: creating work schedules, prioritizing workload, preparing in advance, and setting realistic timescales.
- Bachelor's degree or equivalent work experience.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

Key Responsibilities:
- Design, build, and maintain robust and scalable data pipelines to support analytics and reporting needs.
- Manage and optimize data lake architectures, with a focus on Apache Atlas for metadata management, data lineage, and governance (see the sketch below).
- Integrate and curate data from multiple structured and unstructured sources to enable advanced analytics.
- Collaborate with data scientists and business analysts to ensure the availability of clean, well-structured data.
- Implement data quality, validation, and monitoring processes across data pipelines.
- Develop and manage Power BI datasets and data models, supporting dashboard and report creation.
- Support data cataloging and classification using Apache Atlas for enterprise-wide discoverability and compliance.
- Ensure adherence to data security, privacy, and compliance policies.
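A minimal sketch of querying Apache Atlas's standard v2 REST API for cataloged entities, as the metadata-management responsibilities above describe; the host, credentials, and entity names are hypothetical placeholders.

```python
# Hedged sketch: basic search against Atlas's /api/atlas/v2 surface to find a
# registered Hive table entity. Host/credentials/names are hypothetical.
import requests

ATLAS = "http://atlas.example.internal:21000/api/atlas/v2"  # hypothetical host
AUTH = ("atlas_svc", "...")

resp = requests.get(
    f"{ATLAS}/search/basic",
    params={"typeName": "hive_table", "query": "orders_daily"},
    auth=AUTH,
)
resp.raise_for_status()

# Each result carries a GUID usable for lineage and classification calls.
for entity in resp.json().get("entities", []):
    print(entity["guid"], entity["attributes"].get("qualifiedName"))
```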

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

About Us

What's in it for YOU:
- SBI Card truly lives by the work-life balance philosophy, with a robust wellness and wellbeing program to support employees' mental and physical health.
- Admirable work deserves to be rewarded: a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive, and diverse team culture; Gender Neutral Policy.
- Inclusive health benefits for all: medical insurance, personal accident cover, group term life insurance, annual health checkup, and dental and OPD benefits.
- Commitment to the overall development of employees through a comprehensive learning & development framework.

Role Purpose: Responsible for the management of recovery portfolio (all vintages) performance through a team of Vendor Managers, Team Leaders, and tele-calling agents.

Role Accountability:
- Formulate strategies for low-performing segments/markets and implement them with vendor sites; guide vendors to design competitive incentive plans that ensure performance improvement.
- Review the portfolio both qualitatively and quantitatively at account/NRR/region/channel level to identify gaps, issues, and red flags; design solutions that fix root causes and highlight them to relevant teams.
- Share initial advisory with the Strategy team on the quality of the overall portfolio or particular segments thereof.
- Liaise with the Strategy and Dialer teams to design appropriate call-service campaigns that cater to changing portfolio needs and ensure overall improvement in portfolio performance.
- Identify portfolio segments for legal approaches such as mediation, conciliation, Lok Adalat, and arbitration to extract on identified accounts, and attend all camps as required.
- Monitor channel partner performance through a structured review mechanism and ensure appropriate capacity planning and portfolio balancing among channel partners.
- Review all vendor productivity metrics to ensure accuracy in vendor payouts; ensure adherence to BCP guidelines and DR drill schedules across all channel partner sites.
- Ensure necessary training and certifications for tele-calling staff in line with compliance requirements.
- Stay abreast of sudden actions (by a regulator, government, or any other entity) that may impact portfolio performance, and update concerned stakeholders in a timely manner.
- Ensure adherence to cost targets in tele-recovery operations; scan the market for industry best practices and analyze internal processes to identify and recommend enhancement opportunities.
- Conduct spot audits to ensure adherence to regulatory and internal data security guidelines in all recovery operations at channel partner sites.

Measures of Success: Rate of Recovery (ROR), money recovered, FTE/NFTE productivity, PLI penetration, waiver targets, tele retention rate, FTE/NFTE training coverage, budget adherence in tele-calling operations, customer complaints volume, cost reduction as per MOU, vendor SLA adherence, no adverse observations in internal/external audits, process adherence as per MOU.

Technical Skills / Experience / Certifications: Credit card knowledge along with a good understanding of recovery processes; knowledge of dialer strategies; experience managing large, distributed vendor teams.

Competencies critical to the role: Stakeholder management, result orientation, analytical ability, process orientation, market awareness, problem-solving skills.

Qualification: Post-Graduate/Graduate degree in any discipline. Preferred Industry: FSI.

Posted 1 month ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role Overview: As an Agile Program Manager for a Skyhigh Product Group, you will act as a central coordination point providing a consolidated, holistic view of program delivery progress across a Product Group. You will ensure clear and transparent communication with all stakeholders, and your passion and enthusiasm for organization and attention to detail will enable you to fully execute the planning, facilitation, and communication of complex deliveries. You will bring best-practice Agile delivery improvements to the team, embedding a data-driven approach to driving a continuous-improvement culture.

In this role: The Agile Program Manager is part of a team of program managers that operates across the various product groups that together make up Skyhigh Security's portfolio of products.

Role details - Program Leadership:
- Work with senior leadership to ensure that the Product Domain and program goals are aligned with the company's strategic vision.
- Lead end-to-end planning, driving accountability in teams toward delivery of major initiatives within the product domain; define program milestones and success criteria in alignment with OKRs.
- Plan, facilitate, and communicate across product domains to provide a holistic, consolidated Product Group delivery view with transparent progress information at the portfolio level. This includes:
  - Proactively identifying and managing major dependencies on departments outside engineering, particularly in relation to New Product Introduction items.
  - Collaborating with teams across product management, engineering, design, marketing, sales, and customer success to ensure alignment and seamless delivery execution.
  - Owning and delivering all reporting, including to executive stakeholders, on program progress, RAID, and milestones.
  - Fostering a clear and effective communication approach so all Product Group portfolio information is readily available.
  - Coordinating annual and quarterly portfolio planning.
- Proactively identify, assess, and mitigate Product Group-level risks.
- Deliver and execute all initiative tracking, including workforce allocation against business-defined goals and budget guardrails, and value tracking for limited-availability releases and recent GA releases.

You will also:
- Ensure Jira can deliver consistent portfolio-level reports, while enforcing adherence within the teams for the collection of core data.
- Identify key dependencies across the product group and the wider portfolio, ensuring these are picked up and owned by the appropriate Engineering Manager.
- Seek out continuous improvement by working alongside other Program Managers to drive a common approach to portfolio management across process, tools, and people; establish portfolio execution KPIs at the Product Group level and drive improvement initiatives against them.
- Provide coaching and development to the teams on agile delivery best practices.

General background and experience required for a Program Manager:
- 8-10+ years of agile program management experience in engineering product domains, with at least 3+ years managing complex engineering initiatives for a Product Group comprising multiple product domains.
- Experience working with distributed engineering teams across time zones in a global organization.
- Extensive expertise in agile program management discipline and methodologies.
- Demonstrated ability to facilitate, lead, organize, and motivate matrixed teams while working across team dependencies to achieve program results within defined project milestones and identified timelines.
- Excellent time management, communication (written and verbal), and organization skills across multiple levels and functional areas, with a strong ability to cohesively synthesize data and key points for both internal and executive consumption.
- Excellent knowledge of change management methodology.
- Tools: proficiency in Agile program management tools, e.g., Jira and Confluence.

It would be great if you also have the following, but they are not required: PMP certification, Agile certification.

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Job Summary: Are you passionate about data management and unifying complex datasets? Do you have a track record of leading successful data unification projects? Reltio is seeking a dynamic and experienced Senior Manager to join our Data Unification Team in India. As a Senior Manager, you will play a pivotal role in driving the development and execution of data unification strategies and initiatives, ensuring high-quality and accurate data for our clients.

Job Duties and Responsibilities:
- Leadership: Provide strong leadership and guidance to the Data Unification Team, driving a culture of excellence, innovation, and collaboration.
- Data Unification Strategy: Develop and implement data unification strategies, frameworks, and best practices to deliver effective data management solutions.
- Team Management: Lead, mentor, and inspire a team of data engineers and analysts, fostering their professional growth and ensuring the team's success in meeting project goals.
- Data Governance: Define and enforce data governance policies, standards, and procedures to ensure data quality, integrity, and security across all data unification activities.
- Project Management: Oversee end-to-end project management, including scoping, planning, resource allocation, and execution of data unification projects, ensuring timely delivery within budget and scope.
- Stakeholder Collaboration: Collaborate with cross-functional teams, including Product Management, Engineering, and Customer Success, to align data unification initiatives with overall business objectives and customer requirements.
- Continuous Improvement: Identify areas for process improvement, automation, and optimization, driving efficiency and scalability in data unification operations.
- Industry Trends: Stay updated with industry trends, emerging technologies, and best practices in data management and unification, leveraging this knowledge to drive innovation and enhance our offerings.

Skills You Must Have:
- 9+ years of experience in data management, data unification, or related fields, with a focus on managing large-scale data projects.
- Strong leadership and managerial skills, with a proven track record of successfully leading and motivating high-performing teams.
- In-depth knowledge of data unification methodologies, tools, and technologies, including Master Data Management (MDM) and data integration techniques.
- Solid understanding of data governance principles, data quality frameworks, and data security best practices.
- Excellent project management skills, with the ability to manage multiple projects simultaneously, prioritize tasks, and meet deadlines.
- Strong analytical and problem-solving abilities, with the capacity to analyze complex data sets, identify patterns, and propose innovative solutions.
- Effective communication and stakeholder management skills, with the ability to collaborate with and influence cross-functional teams and senior leadership.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Pune

Work from Office

[{"Salary":"30Lacs - 34.99Lacs" , "Remote_Job":false , "Posting_Title":"Technical Architect Apps / Cloud" , "Is_Locked":false , "City":"Pune City","Industry":"IT Services","Job_Description":" Job Title: Lead Software Engineer \u2013 Cloud Job Code: Reports to: Technical Architect FLSA: Exempt Department: IT-Development Revision Date: 18-03-2024 BASIC PURPOSE: Lead Software Engineer - Cloud, who will be responsible for the plan, design, as well as deployment automation of platform solutions on AWS. Instrumental in profiling and improving front-end and back-end application performance, mentor team members and take end to end technical ownership of applications. Must be able to stay on top of technology changes in the market and continuously look for opportunities to leverage new technology. ESSENTIAL FUNCTIONS: Design, build and implement performant and robust cloud platform solutions. Design and build data pipelines for supporting analytical solutions. Provide level of effort estimates to support planning activities. Provide microservices architecture and design specifications. Fix defects found during implementation process or reported by the software test team. Support software process definition and improvement initiatives and release process working with DevOps team in CI/CD pipelines developed with Terraform and CDK as Infrastructure-as-Code. Execute security architectures for cloud systems. Understand and recognize the quality consequences which may occur from the improper performance of their specific job; has awareness of system defects that may occur in their area of responsibility, including product design, verification, and validation, and testing activities. Mentor less experienced team members. Collaborate with Product Designers, Product Managers, Architect and Software Engineers to deliver compelling user-facing products. REPORTING RELATIONSHIPS : Reports to Technical Architect QUALIFICATIONS: Bachelors degree in Computer Science / related engineering field OR equivalent experience in related field. 10+ years of experience in cloud application development. Expert proficiency in JavaScript / Typescript and/or Java with Spring Boot or Quarkus. Experience in architecting and developing event driven cloud-based solutions. Experience in AWS services including API Gateway, AppSync, Amplify, S3, CloudFront, Lambda, ECS/Fargate, Step Functions, SQS, Event Bridge, Cognito, Dynamo, Aurora PostgreSQL, OpenSearch/Elasticsearch, AWS Pinpoint. Extensive experience in developing applications in POSIX compliant environments. Strong knowledge of containerization, with expert knowledge of either Docker or Kubernetes. Proficient in IAM security and AWS Networking. Expert understand of building and working with CI/CD pipelines. Experience in designing, developing and creating data pipelines, data warehouse applications and analytical solutions including machine learning. Deep cloud domain expertise in: architecture, big data, microservice architectures, cloud technologies, data security and privacy, tools, and testing Excellent programming skills in data pipeline technologies like Lambda, Kinesis, S3, EventBridge and MSK Extensive experience with Service Oriented Architecture, microservices, virtualization and working with relational databases and non-relational databases. Excellent knowledge of building big data solutions using NoSQL databases. Experience with secure coding best practices and methodologies, vulnerability scans, threat modeling, and cyber-risk assessments. 
Familiar with modern build pipelines and tools Ability to understand business requirements and translate them into technical designs Familiarity with Git code versioning tools Good written, verbal communication skills Great team player PREFERRED SKILLS: Experience with RDBMS and is a plus Experience in Java, .NET, Python is a plus Experience in big data solutions and analytics; using BI tools like Power BI or AWS QuickSight is a plus Experience with other cloud computing platforms Azure or AWS Certification such as a Solutions Architect Expert, Azure Fundamentals, data scientist, developer, etc.
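As an illustration of the Infrastructure-as-Code practice this role names, below is a minimal AWS CDK v2 (Python) sketch wiring an SQS queue to a Lambda function, the kind of event-driven building block listed above. The stack name, construct IDs, and the lambda asset directory are hypothetical; a real platform would add IAM boundaries, dead-letter queues, and observability.

# Minimal AWS CDK v2 sketch: SQS queue triggering a Lambda (illustrative only).
from aws_cdk import App, Stack, Duration
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_sqs as sqs
from aws_cdk.aws_lambda_event_sources import SqsEventSource
from constructs import Construct

class IngestStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Queue with a visibility timeout comfortably above the handler's runtime.
        queue = sqs.Queue(self, "EventsQueue", visibility_timeout=Duration.seconds(60))
        handler = _lambda.Function(
            self, "EventsHandler",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),  # hypothetical asset directory
        )
        # Batch messages into the handler; CDK wires the IAM permissions.
        handler.add_event_source(SqsEventSource(queue, batch_size=10))

app = App()
IngestStack(app, "IngestStack")
app.synth()

Running cdk deploy against this app would provision the queue, function, and event-source mapping in one step, which is the reproducibility IaC is meant to buy.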

Posted 1 month ago

Apply

12.0 - 17.0 years

20 - 25 Lacs

Noida

Work from Office

Position Summary
Overall 12+ years of quality engineering experience with DWH/ETL for enterprise-grade applications
Hands-on experience with functional, non-functional and automation testing of products
Hands-on experience leveraging LLMs/GenAI to improve the efficiency and effectiveness of the overall delivery process
Job Responsibilities
Leading end-to-end QE for the product suite
Authoring the QE test strategy for a release and executing it
Driving quality releases by closely working with development, PMs, DevOps, support and business teams
Achieving automation coverage for the product suite with good line coverage
Managing risks and resolving issues that affect release scope, schedule and quality
Working with product teams to understand the impacts of branches, code merges, etc.
Leading and coordinating release activities, including overall execution
Ability to lead a team of SDETs and help them address their issues
Mentoring and coaching members of the team
Education
BE/B.Tech
Master of Computer Application
Work Experience
Overall 12+ years of strong hands-on experience with DWH/ETL for enterprise-grade applications
Behavioural Competencies
Teamwork & Leadership
Motivation to Learn and Grow
Ownership
Cultural Fit
Talent Management
Technical Competencies
Lifescience Knowledge
AWS Data Pipeline
Azure Data Factory
Data Governance
Data Modelling
Data Privacy
Data Security
Data Validation Testing Tools
Data Visualisation
Databricks
Snowflake
Amazon Redshift
MS SQL Server
Performance Testing
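To make the DWH/ETL automation expectation concrete, here is a small, hedged example of the kind of post-load reconciliation test such a QE suite might contain, written with pytest against an in-memory SQLite stand-in; the staging_orders/dwh_orders tables and columns are hypothetical placeholders for real source and warehouse tables.

# Illustrative ETL reconciliation tests: counts and totals must survive the load.
import sqlite3
import pytest

@pytest.fixture
def conn():
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE staging_orders (id INTEGER, amount REAL);
        CREATE TABLE dwh_orders     (id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO dwh_orders     VALUES (1, 10.0), (2, 20.5);
    """)
    yield con
    con.close()

def test_row_counts_match(conn):
    src = conn.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dwh_orders").fetchone()[0]
    assert src == tgt  # no rows dropped or duplicated by the load

def test_amount_totals_match(conn):
    src = conn.execute("SELECT ROUND(SUM(amount), 2) FROM staging_orders").fetchone()[0]
    tgt = conn.execute("SELECT ROUND(SUM(amount), 2) FROM dwh_orders").fetchone()[0]
    assert src == tgt  # measures reconcile, not just row counts

In a real suite the fixture would point at the actual staging and warehouse connections, and checks like these would run automatically after every pipeline execution.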

Posted 1 month ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Role & responsibilities Driving Sales,Selling Cloud Cross products of CISCO,Palo Alto, Veritas Product line Virtual and Cloud Servers, Backup Services, Managing hosting Services, Peripherals, , Veritas, Application Security, Backup solution,data security, end point security, hyper converged infrastructure, data security, PAM, access management Antivirus and Security Services- Drive revenue and market share in Enterprise, SMB, Hospitals, BFSI, Startup, and other industry for India (Regional Market) Having Direct Sales Experience Work with Partners, OEM, and Distributors to extend reach and drive operations Proactively sourced and developed new business from internal and external referral networks. Driving Sales initiatives to achieve business goals and manage the stiff target Preferred candidate profile Having exp in relevant field in domestic market

Posted 1 month ago

Apply

10.0 - 14.0 years

17 - 30 Lacs

Hyderabad, Mumbai (All Areas)

Work from Office

Requirement:
Around 10 years of sales experience in the local territory.
Experience in selling IT infrastructure products (Cisco and other OEMs), including Enterprise Networking, Cyber Security, Collaboration, and Data Centre.
Experience in selling software and cloud services products (Cisco and other OEMs).
Quick learner with the ability to showcase our offerings compellingly.
Confident personality and a good team player.
Excellent communication, interpersonal, problem-solving, presentation, and organizational skills.
Personal integrity in commercial dealings.
Job Responsibilities:
Strong understanding of the sales processes relating to IT products, software, and services.
Excellence at lead generation, adding new customers, building customer relationships, cross-selling, up-selling, negotiations, and closing deals.
Ensure a customer-first approach and create a service-level differentiator through quick turnarounds and any-time availability.
Self-confidence in accepting and achieving gross margin targets through skilful negotiations with customers and OEMs.
Proficiency and diligence in working with sales management software systems and CRM, keeping them updated with commercial, technical, and completion issues, etc.

Posted 1 month ago

Apply

1.0 - 6.0 years

4 - 7 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
Role Description: As part of the cybersecurity organization, the Data Engineer is responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing.
Be a key team member assisting in the design and development of the data pipeline.
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures.
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
Implement data security and privacy measures to protect sensitive data.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Collaborate and communicate effectively with product teams.
Collaborate with data scientists to develop pipelines that meet dynamic business needs.
Share and discuss findings with team members, practicing the SAFe Agile delivery model.
Functional Skills:
Basic Qualifications:
Master's degree and 1 to 3 years of Computer Science, IT or related field experience, OR
Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience, OR
Diploma and 7 to 9 years of Computer Science, IT or related field experience
Preferred Qualifications:
Hands-on experience with data practices, technologies, and platforms, such as Databricks, Python, GitLab, Lucidchart, etc.
Proficiency in data analysis tools (e.g. SQL) and experience with data sourcing tools
Excellent problem-solving skills and the ability to work with large, complex datasets
Understanding of data governance frameworks, tools, and best practices
Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA)
Good-to-Have Skills:
Experience with ETL tools and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Knowledge of Python/R, Databricks, and cloud data platforms
Experience working in a product team environment
Experience working in an Agile environment
Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certificate preferred
Soft Skills:
Initiative to explore alternate technologies and approaches to solving problems
Skilled in breaking down problems, documenting problem statements, and estimating effort
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
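For illustration, a minimal sketch of the extract-transform-load skeleton this role describes, with basic failure logging that a scheduler could alert on. The file names, the user_id quality rule, and the CSV-to-JSON flow are assumptions for the sketch, not the employer's actual stack.

# Minimal ETL skeleton with failure monitoring (illustrative; sources/sinks are hypothetical).
import csv, json, logging, sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def extract(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Drop records failing a basic quality rule and normalize types.
    out = []
    for r in rows:
        if not r.get("user_id"):
            log.warning("dropping row without user_id: %r", r)
            continue
        r["event_count"] = int(r.get("event_count", 0))
        out.append(r)
    return out

def load(rows, path):
    with open(path, "w") as f:
        json.dump(rows, f, indent=2)

def run(src, dst):
    try:
        rows = transform(extract(src))
        load(rows, dst)
        log.info("pipeline ok: %d rows loaded", len(rows))
    except Exception:
        log.exception("pipeline failed")  # a scheduler/alerting hook would fire here
        sys.exit(1)

if __name__ == "__main__":
    run("security_events.csv", "security_events.json")  # hypothetical file names

A non-zero exit code is what lets an orchestrator mark the run failed and retry or page, which is the "monitored for failures" requirement in miniature.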

Posted 1 month ago

Apply

9.0 - 12.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description: We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
Ensure data security, compliance, and role-based access control (RBAC) across data environments.
Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.
Must-Have Skills:
Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning of big data processing.
Strong understanding of AWS services.
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
Ability to quickly learn, adapt and apply new technologies.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
Deep expertise in the Biotech and Pharma industries.
Experience writing APIs to make data available to consumers.
Experience with SQL/NoSQL databases and vector databases for large language models.
Experience with data modeling and performance tuning for both OLAP and OLTP databases.
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Education and Professional Certifications
9 to 12 years of Computer Science, IT or related field experience
AWS Certified Data Engineer preferred
Databricks certificate preferred
Scaled Agile SAFe certification preferred
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly; organized and detail-oriented.
Strong presentation and public speaking skills.
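As a concrete, hedged example of the batch processing and partitioning work described above, here is a minimal PySpark job that reads semi-structured events, derives a partition column, and writes partitioned Parquet. The S3 paths, column names, and app name are hypothetical.

# Hedged PySpark sketch of a partitioned batch load (paths/columns are placeholders).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-batch").getOrCreate()

# Read semi-structured input, derive a partition column, drop unparseable rows.
df = spark.read.json("s3://example-bucket/raw/events/")        # hypothetical path
df = (df.withColumn("event_date", F.to_date("event_ts"))
        .filter(F.col("event_date").isNotNull()))

# Cluster data by partition key before writing, so each date lands in one set of files.
(df.repartition("event_date")
   .write.mode("overwrite")
   .partitionBy("event_date")
   .parquet("s3://example-bucket/curated/events/"))            # hypothetical path

spark.stop()

Partitioning by date like this is what later lets query engines prune to the days a query touches, one of the query-performance levers the posting lists.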

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 17 Lacs

Pune

Work from Office

Job Overview: Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.
Qualifications:
B.E./B.Tech in Computer Science, IT, or related discipline
MCS/MCA or equivalent preferred
Key Responsibilities:
Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions
Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery
Define and implement best practices for data modeling, integration, governance, and security
Collaborate with engineering and analytics teams to ensure data solutions meet business needs
Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control
Troubleshoot data issues and ensure system performance, reliability, and scalability
Guide and mentor junior data engineers and developers
Experience and Skills Required:
5 to 12 years of experience in data architecture, engineering, or analytics roles
Hands-on expertise in Databricks, especially Azure Databricks
Proficient in Snowflake, with working knowledge of DBT, Airflow, and GitHub
Experience with Google BigQuery and cloud-native data processing workflows
Strong knowledge of modern data architecture, data lakes, warehousing, and ETL pipelines
Excellent problem-solving, communication, and analytical skills
Nice to Have:
Certifications in Azure, Snowflake, or GCP
Experience with containerization (Docker/Kubernetes)
Exposure to real-time data streaming and event-driven architecture
Why Join Diacto Technologies?
Collaborate with experienced data professionals and work on high-impact projects
Exposure to a variety of industries and enterprise data ecosystems
Competitive compensation, learning opportunities, and an innovation-driven culture
Work from our collaborative office space in Baner, Pune
How to Apply:
Option 1 (Preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrTQoMsfqaoNwTxsE_qwWYcpcRyYJk7NzSUmO3LKb6rM-8FcU58CUPYQKc65n66feHor-TGdCEfyouj0NmKdgYcNbA==/
Option 2:
1. Visit our website's career section at https://www.diacto.com/careers/
2. Scroll down to the "Who are we looking for?" section
3. Find the listing for "Data Architect (Data Bricks)"
4. Proceed with the virtual interview by clicking on "Apply Now"
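To illustrate the orchestration stack the posting names (Airflow with DBT), here is a minimal, assumed Airflow DAG that runs a nightly dbt build followed by its tests; the DAG id, schedule, and bash commands are placeholders, not Diacto's actual pipeline.

# Illustrative Airflow DAG: nightly dbt run, then dbt tests (IDs/commands hypothetical).
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --profiles-dir .")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --profiles-dir .")
    dbt_run >> dbt_test  # tests gate on a successful build

Keeping the DAG in GitHub and promoting it through CI is how the orchestration and version-control responsibilities in the posting typically fit together.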

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies