668 Normalization Jobs - Page 4

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

India

Remote

Job Title: SQL Developer Trainee
Location: Remote
Job Type: Internship (Full-Time)
Duration: 1–3 Months
Stipend: ₹25,000/month
Department: Data & Engineering

Job Summary: We are seeking a detail-oriented and motivated SQL Developer Trainee to join our team remotely. This internship is designed for recent graduates or students who want to gain practical experience in database development, writing SQL queries, and working with data in real-world applications.

Key Responsibilities:
- Write, test, and optimize SQL queries for data extraction and reporting
- Assist in designing and maintaining database structures (tables, views, indexes, etc.)
- Help ensure data integrity, accuracy, and security across systems
- Support the team in troubleshooting and debugging database-related issues
- Collaborate with developers and analysts to fulfill data requirements for projects
- Document query logic and database-related processes

Qualifications:
- Bachelor's degree (or final-year student) in Computer Science, Information Technology, or a related field
- Strong understanding of SQL and relational databases (e.g., MySQL, PostgreSQL, SQL Server)
- Familiarity with database design and normalization
- Analytical mindset with good problem-solving skills
- Ability to work independently in a remote setting
- Eagerness to learn and grow in a data-driven environment

Preferred Skills (Nice to Have):
- Experience with stored procedures, triggers, or functions in SQL
- Exposure to BI/reporting tools (Power BI, Tableau, etc.)
- Understanding of data warehousing concepts
- Familiarity with cloud-based databases or platforms

What We Offer:
- Monthly stipend of ₹25,000
- Remote work opportunity
- Hands-on experience with real-world datasets and projects
- Mentorship and structured learning sessions
- Certificate of Completion
- Potential for full-time employment based on performance
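Since the normalization skill this posting (and this whole page of listings) asks about is concrete, here is a minimal sketch of the idea using Python's built-in sqlite3 module. All table and column names are hypothetical, invented purely for illustration; this is not from the listing itself.

```python
import sqlite3

# In a denormalized design, every order row repeats the customer's
# details; normalization (roughly 3NF here) splits the repeating
# customer attributes into their own table keyed by customer_id.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
""")

conn.execute("INSERT INTO customers VALUES (1, 'Asha', 'Pune')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(101, 1, 2500.0), (102, 1, 400.0)])

# A join reassembles the denormalized view on demand, so customer
# details are stored once and cannot drift out of sync.
for row in conn.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```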

Posted 4 days ago

Apply

5.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Job Description
This is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Summary
Database Engineer/Developer - Core Skills: Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles. Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous. A good understanding of data security measures and compliance is also required. Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes, and knowledge of cloud-based databases like AWS RDS and Google BigQuery. A minimum of 5 years of experience is required.

JD: Database Engineer - Data Research Engineering

Position Overview
At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand-new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Responsibilities
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, including small-scale databases and databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the full-stack web developer on the team to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources such as StackOverflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations.

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.
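The "develop import workflows and scripts to automate data import processes" duty above can be illustrated with a short, hedged sketch using pandas and SQLAlchemy, two of the modules this posting names. The file name, column names, table name, and connection URL are placeholders invented for the example, not details from the listing.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection URL and file name; substitute real values.
ENGINE = create_engine("postgresql+psycopg2://user:pass@localhost/research")
SOURCE = "monthly_rates.csv"  # hypothetical spreadsheet export

def import_spreadsheet(path: str, table: str) -> int:
    df = pd.read_csv(path)

    # Basic validation before load: normalize column names, drop
    # exact duplicates, and refuse rows missing required keys.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    required = ["provider", "rate"]  # hypothetical required columns
    df = df.dropna(subset=[c for c in required if c in df.columns])

    # Append into the target table; pandas issues the INSERTs for us.
    df.to_sql(table, ENGINE, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"loaded {import_spreadsheet(SOURCE, 'rates_staging')} rows")
```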

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Role: Database Engineer
Location: Remote

Skills and Experience
● Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
● Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
● Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
● Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
● Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
● Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
● Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
● Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
● Knowledge of SQL and understanding of database design principles, normalization, and indexing.
● Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
● Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
● Eagerness to develop import workflows and scripts to automate data import processes.
● Knowledge of data security best practices, including access controls, encryption, and compliance standards.
● Strong problem-solving and analytical skills with attention to detail.
● Creative and critical thinking.
● Strong willingness to learn and expand knowledge in data engineering.
● Familiarity with Agile development methodologies is a plus.
● Experience with version control systems, such as Git, for collaborative development.
● Ability to thrive in a fast-paced environment with rapidly changing priorities.
● Ability to work collaboratively in a team environment.
● Good and effective communication skills.
● Comfortable with autonomy and ability to work independently.
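The indexing knowledge this listing asks for largely comes down to reading execution plans. A minimal sketch with sqlite3 (chosen only because it ships with Python; the posting itself names PostgreSQL and MySQL, whose EXPLAIN output differs): compare the plan for a filtered query before and after adding an index. Table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, ts TEXT)")
conn.executemany("INSERT INTO events (user_id, ts) VALUES (?, ?)",
                 [(i % 500, f"2025-05-{i % 28 + 1:02d}") for i in range(10_000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index, the planner must scan the whole table.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall())

# After indexing the filter column, the plan switches to an index search.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall())
```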

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Role: Database Engineer
Location: Remote
Notice Period: 30 Days

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities
· Design, write, and optimize complex SQL queries, stored procedures, views, and triggers in MS SQL Server.
· Collaborate with business analysts and application developers to gather data requirements and implement solutions.
· Work within the iPROOF LCNC platform to develop, configure, and deploy business processes and user interfaces.
· Document solutions, data structures, and workflows effectively.

Required Skills
· Strong knowledge of MS SQL Server, including T-SQL, indexing, and performance tuning.
· Familiarity with stored procedures, functions, and triggers.
· Understanding of relational database design and normalization.
· Willingness to learn and work on the iPROOF LCNC platform.
· Basic understanding of JSON data structures and their use in dynamic systems.
· Good communication and documentation skills.
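As a rough illustration of the view and trigger work described above, here is a hedged sketch using Python's sqlite3, standing in for MS SQL Server (which the posting actually targets; T-SQL syntax would differ). The invoice/audit schema is invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (id INTEGER PRIMARY KEY, total REAL NOT NULL);
CREATE TABLE invoice_audit (invoice_id INT, old_total REAL, new_total REAL);

-- A view exposing only what reporting needs.
CREATE VIEW large_invoices AS
    SELECT id, total FROM invoices WHERE total >= 1000;

-- A trigger recording every change to an invoice total.
CREATE TRIGGER trg_invoice_update AFTER UPDATE OF total ON invoices
BEGIN
    INSERT INTO invoice_audit VALUES (OLD.id, OLD.total, NEW.total);
END;
""")

conn.execute("INSERT INTO invoices (total) VALUES (1500)")
conn.execute("UPDATE invoices SET total = 1800 WHERE id = 1")
print(conn.execute("SELECT * FROM large_invoices").fetchall())
print(conn.execute("SELECT * FROM invoice_audit").fetchall())
```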

Posted 4 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Sales and Distribution (SD)
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact for the project
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Sales and Distribution (SD)
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Sales and Distribution (SD)
- This position is based in Mumbai
- 15 years of full-time education is required
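Several listings on this page (this one and the Accenture roles further down) cite "data munging techniques, including data cleaning, transformation, and normalization." A brief hedged sketch of those three steps with pandas; the toy dataset and column names are invented for illustration.

```python
import pandas as pd

# Toy dataset with the usual problems: inconsistent text,
# missing values, and a numeric column on an arbitrary scale.
df = pd.DataFrame({
    "region": [" North", "south", "North ", None],
    "sales":  [120.0, 80.0, None, 200.0],
})

# Cleaning: trim/standardize text, drop rows with no region.
df["region"] = df["region"].str.strip().str.title()
df = df.dropna(subset=["region"])

# Transformation: fill missing sales with the column median.
df["sales"] = df["sales"].fillna(df["sales"].median())

# Normalization (min-max): rescale sales into [0, 1] for modeling.
lo, hi = df["sales"].min(), df["sales"].max()
df["sales_norm"] = (df["sales"] - lo) / (hi - lo)

print(df)
```

Note that "normalization" here is the statistical rescaling sense; the database postings on this page use the relational-design sense, as in the sqlite3 example near the top of the page.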

Posted 4 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location: Bangalore, Karnataka, 560100
Category: Engineering / Information Technology
Job Type: Full time
Job Id: 1184886

NoSQL Developer

This role has been designed as 'Hybrid' with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are:
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description:

What you'll do:
HPE Operations is our innovative IT services organization. It provides the expertise to advise, integrate, and accelerate our customers' outcomes from their digital transformation. Our teams collaborate to transform insight into innovation. In today's fast-paced, hybrid IT world, being at business speed means overcoming IT complexity to match the speed of actions to the speed of opportunities. Deploy the right technology to respond quickly to market possibilities. Join us and redefine what's next for you.

Scope of Work for NoSQL Database Administrators

1. Database Design and Architecture: Collaborate with developers and architects to design and implement efficient database schemas. Ensure proper normalization and indexing for optimal performance and scalability. Evaluate and implement replication architectures as needed for high availability and fault tolerance.

2. Performance Tuning: Monitor database performance using tools like NoSQL Enterprise Monitor or custom scripts. Identify and optimize poorly performing queries through query analysis, index optimization, and query rewriting. Configure NoSQL server settings, buffer pools, and caches to maximize throughput and minimize response times.

3. Security and Compliance: Configure role-based access controls (RBAC) and auditing features to ensure data integrity and confidentiality. Coordinate with UIDAI-appointed GRCP and security audit agencies to conduct regular security audits and share artifacts to address any identified risks promptly.

4. Backup and Disaster Recovery: Ensure integration of databases with suitable backup mechanisms and recovery procedures to safeguard against data loss and ensure business continuity. Coordinate with relevant teams to conduct regular DR drills.

5. Monitoring and Alerting: Set up monitoring systems to track database health and performance metrics. Configure automated alerts to notify administrators of critical issues, such as performance degradation, replication lags, or storage constraints. Proactively investigate and resolve alerts to maintain system stability and availability.

6. Capacity Planning: Monitor database growth trends and resource utilization to forecast future capacity requirements. Evaluate and recommend hardware upgrades or version upgrades to support long-term scalability goals.

7. Maintenance and Upgrades: Perform routine maintenance tasks, including database backups, index rebuilds, and statistics updates, during scheduled maintenance windows. Support execution of NoSQL version upgrades and patch deployments, ensuring compatibility and minimal downtime. Coordinate with application teams to test and validate database changes in development and staging environments before production rollout.

8. Documentation and Knowledge Sharing: Maintain comprehensive documentation of database configurations and other Standard Operating Procedures. Provide training and knowledge transfer to team members on database administration best practices, tools, and technologies. Foster a culture of continuous learning and improvement through regular team meetings and knowledge-sharing sessions.

9. Incident Response and Problem Resolution: Respond to database-related incidents and outages promptly, following established incident management procedures. Work jointly with relevant teams to carry out root cause analysis to identify underlying issues and support the implementation of corrective actions to prevent recurrence. Collaborate with cross-functional teams, including developers, network engineers, and system administrators, to troubleshoot complex issues and drive resolution.

10. Service Ticket Handling for DML Operations: Receive and prioritize service tickets related to Data Manipulation Language (DML) operations, including INSERT, UPDATE, DELETE, and SELECT queries. Analyze and troubleshoot reported issues, such as data inconsistency, performance degradation, or query optimization. Work closely with application developers and end users to understand the context and requirements of DML operations. Provide guidance and recommendations to developers and architects on optimizing DML queries for improved performance and efficiency. Implement database schema changes, data migrations, and data transformations as requested through service tickets, ensuring proper testing and validation procedures are followed by the development team. Communicate updates and resolutions to stakeholders in a timely and transparent manner, ensuring customer satisfaction and alignment with service level agreements. Collaborate with other teams, such as application support, quality assurance, and release management, to address cross-functional dependencies and ensure smooth execution of DML-related tasks.

11. Collaboration with Developers for DDL Operations: Assist developers and application architects in planning and executing Data Definition Language (DDL) operations, such as creating, altering, and dropping database objects (tables, indexes, views, etc.). Review proposed schema changes and provide recommendations on best practices for database design and optimization in consultation with application architects and developers. Perform impact analysis to assess the potential implications of DDL changes on existing data, applications, and performance. Execute DDL changes during scheduled maintenance windows, following change management procedures and ensuring minimal disruption to production systems.

12. Data Archival and Cleanup: Collaborate with application architects and developers to define data retention policies and archival strategies for each schema/table based on UIDAI data retention policies and business needs. Develop and implement data archival processes to move inactive or historical data to secondary storage or archival databases, freeing up space and improving database performance. Monitor data growth trends and implement proactive measures, such as purging, to manage database size and mitigate performance degradation, in consultation with application architects and developers. Document data archival and cleanup procedures, including retention periods, criteria for data selection, and execution schedules, ensuring compliance with data governance policies.

What you need to bring:
Qualification: BE / BTech / MCA / MSc
Minimum total experience: 4+ years
Location: Bengaluru, UIDAI onsite deployment
Nature/Key activities: DBA-related activities

Additional Skills: Accountability, Action Planning, Active Learning, Active Listening, Bias, Business Growth, Business Planning, Cloud Computing, Cloud Migrations, Coaching, Commercial Acumen, Creativity, Critical Thinking, Cross-Functional Teamwork, Customer Experience Strategy, Data Analysis Management, Data Collection Management, Data Controls, Design Thinking, Empathy, Follow-Through, Growth Mindset, Hybrid Clouds, Infrastructure as a Service (IaaS) {+ 10 more}

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #operations

Job: Services
Job Level: TCP_02

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
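The query-analysis and indexing duties in items 1 and 2 of this scope of work can be sketched briefly. The posting never names a specific NoSQL product, so the example below assumes MongoDB via the pymongo driver purely for illustration, and it needs a reachable MongoDB instance to run; collection and field names are invented.

```python
from pymongo import MongoClient, ASCENDING

# Assumes a local MongoDB instance; connection details are placeholders.
client = MongoClient("mongodb://localhost:27017")
coll = client["demo"]["events"]

coll.insert_many([{"user_id": i % 500, "kind": "login"} for i in range(10_000)])

query = {"user_id": 42}

# Before indexing: the winning plan is a full collection scan (COLLSCAN).
print(coll.find(query).explain()["queryPlanner"]["winningPlan"])

# Create an index on the filter field; the winning plan then becomes
# an IXSCAN-backed fetch instead of scanning every document.
coll.create_index([("user_id", ASCENDING)])
print(coll.find(query).explain()["queryPlanner"]["winningPlan"])
```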

Posted 5 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location: Bangalore, Karnataka, 560100
Category: Engineering / Information Technology
Job Type: Full time
Job Id: 1184887

NoSQL Specialist

This role has been designed as 'Hybrid' with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are:
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description:

What you'll do:
HPE Operations is our innovative IT services organization. It provides the expertise to advise, integrate, and accelerate our customers' outcomes from their digital transformation. Our teams collaborate to transform insight into innovation. In today's fast-paced, hybrid IT world, being at business speed means overcoming IT complexity to match the speed of actions to the speed of opportunities. Deploy the right technology to respond quickly to market possibilities. Join us and redefine what's next for you.

Scope of Work for NoSQL Database Administrators

1. Database Design and Architecture: Collaborate with developers and architects to design and implement efficient database schemas. Ensure proper normalization and indexing for optimal performance and scalability. Evaluate and implement replication architectures as needed for high availability and fault tolerance.

2. Performance Tuning: Monitor database performance using tools like NoSQL Enterprise Monitor or custom scripts. Identify and optimize poorly performing queries through query analysis, index optimization, and query rewriting. Configure NoSQL server settings, buffer pools, and caches to maximize throughput and minimize response times.

3. Security and Compliance: Configure role-based access controls (RBAC) and auditing features to ensure data integrity and confidentiality. Coordinate with UIDAI-appointed GRCP and security audit agencies to conduct regular security audits and share artifacts to address any identified risks promptly.

4. Backup and Disaster Recovery: Ensure integration of databases with suitable backup mechanisms and recovery procedures to safeguard against data loss and ensure business continuity. Coordinate with relevant teams to conduct regular DR drills.

5. Monitoring and Alerting: Set up monitoring systems to track database health and performance metrics. Configure automated alerts to notify administrators of critical issues, such as performance degradation, replication lags, or storage constraints. Proactively investigate and resolve alerts to maintain system stability and availability.

6. Capacity Planning: Monitor database growth trends and resource utilization to forecast future capacity requirements. Evaluate and recommend hardware upgrades or version upgrades to support long-term scalability goals.

7. Maintenance and Upgrades: Perform routine maintenance tasks, including database backups, index rebuilds, and statistics updates, during scheduled maintenance windows. Support execution of NoSQL version upgrades and patch deployments, ensuring compatibility and minimal downtime. Coordinate with application teams to test and validate database changes in development and staging environments before production rollout.

8. Documentation and Knowledge Sharing: Maintain comprehensive documentation of database configurations and other Standard Operating Procedures. Provide training and knowledge transfer to team members on database administration best practices, tools, and technologies. Foster a culture of continuous learning and improvement through regular team meetings and knowledge-sharing sessions.

9. Incident Response and Problem Resolution: Respond to database-related incidents and outages promptly, following established incident management procedures. Work jointly with relevant teams to carry out root cause analysis to identify underlying issues and support the implementation of corrective actions to prevent recurrence. Collaborate with cross-functional teams, including developers, network engineers, and system administrators, to troubleshoot complex issues and drive resolution.

10. Service Ticket Handling for DML Operations: Receive and prioritize service tickets related to Data Manipulation Language (DML) operations, including INSERT, UPDATE, DELETE, and SELECT queries. Analyze and troubleshoot reported issues, such as data inconsistency, performance degradation, or query optimization. Work closely with application developers and end users to understand the context and requirements of DML operations. Provide guidance and recommendations to developers and architects on optimizing DML queries for improved performance and efficiency. Implement database schema changes, data migrations, and data transformations as requested through service tickets, ensuring proper testing and validation procedures are followed by the development team. Communicate updates and resolutions to stakeholders in a timely and transparent manner, ensuring customer satisfaction and alignment with service level agreements. Collaborate with other teams, such as application support, quality assurance, and release management, to address cross-functional dependencies and ensure smooth execution of DML-related tasks.

11. Collaboration with Developers for DDL Operations: Assist developers and application architects in planning and executing Data Definition Language (DDL) operations, such as creating, altering, and dropping database objects (tables, indexes, views, etc.). Review proposed schema changes and provide recommendations on best practices for database design and optimization in consultation with application architects and developers. Perform impact analysis to assess the potential implications of DDL changes on existing data, applications, and performance. Execute DDL changes during scheduled maintenance windows, following change management procedures and ensuring minimal disruption to production systems.

12. Data Archival and Cleanup: Collaborate with application architects and developers to define data retention policies and archival strategies for each schema/table based on UIDAI data retention policies and business needs. Develop and implement data archival processes to move inactive or historical data to secondary storage or archival databases, freeing up space and improving database performance. Monitor data growth trends and implement proactive measures, such as purging, to manage database size and mitigate performance degradation, in consultation with application architects and developers. Document data archival and cleanup procedures, including retention periods, criteria for data selection, and execution schedules, ensuring compliance with data governance policies.

What you need to bring:
Qualification: BE / BTech / MCA / MSc
Minimum total experience: 4+ years
Location: Bengaluru, UIDAI onsite deployment
Nature/Key activities: DBA-related activities

Additional Skills: Accountability, Active Learning, Active Listening, Bias, Business Growth, Client Expectations Management, Coaching, Creativity, Critical Thinking, Cross-Functional Teamwork, Customer Centric Solutions, Customer Relationship Management (CRM), Design Thinking, Empathy, Follow-Through, Growth Mindset, Information Technology (IT) Infrastructure, Infrastructure as a Service (IaaS), Intellectual Curiosity, Long Term Planning, Managing Ambiguity, Process Improvements, Product Services, Relationship Building {+ 5 more}

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #operations

Job: Services
Job Level: TCP_03

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
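Item 6 above (capacity planning from growth trends) lends itself to a tiny worked example: fit a linear trend to weekly database-size samples and extrapolate. The numbers are invented, and real forecasting would use monitoring data and likely a more careful model; this only shows the arithmetic.

```python
# Minimal linear-trend forecast for database growth, standard library only.
# sizes_gb: invented weekly measurements of database size.
sizes_gb = [410.0, 418.5, 426.0, 435.2, 442.8, 451.1]
weeks = range(len(sizes_gb))

n = len(sizes_gb)
mean_x = sum(weeks) / n
mean_y = sum(sizes_gb) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, sizes_gb))
         / sum((x - mean_x) ** 2 for x in weeks))
intercept = mean_y - slope * mean_x

# Project 26 weeks ahead and flag if it nears provisioned capacity.
capacity_gb = 600.0  # hypothetical provisioned storage
projected = intercept + slope * (n - 1 + 26)
print(f"growth ~{slope:.1f} GB/week, projected in 26 weeks: {projected:.0f} GB")
if projected > 0.8 * capacity_gb:
    print("warning: projection exceeds 80% of capacity; plan an upgrade")
```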

Posted 5 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

IMEA (India, Middle East, Africa) | India | LIXIL INDIA PVT LTD
Employee Assignment | Fully remote possible | Full Time | 1 May 2025

Title: Senior Data Engineer

Job Description
A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making.

Key Responsibilities
- Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository.
- Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions.
- Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency.
- Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects.
- Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability.
- Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and regulatory requirements are met.

Experience & Skills
- Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.).
- Strong programming skills in SQL, Java, and Python.
- Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS, or API-based extraction.
- Expertise in real-time data processing frameworks.
- Strong understanding of Git and CI/CD for automated deployment and version control.
- Experience with Infrastructure-as-Code tools like Terraform for cloud resource management.
- Good stakeholder management skills to collaborate effectively across teams.
- Solid understanding of SAP ERP data and processes to integrate enterprise data sources.
- Exposure to data visualization and front-end tools (Tableau, Looker, etc.).
- Strong command of English with excellent communication skills.
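A hedged sketch of the pipeline work described above, using Apache Airflow (one of the orchestrators this posting names, assuming Airflow 2.4+ for the `schedule` argument). The task bodies are stubs and every name is invented; the point is only the extract-transform-load wiring.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    ...  # pull raw records from a source system (stubbed)

def transform(**_):
    ...  # standardize formats, validate, normalize (stubbed)

def load(**_):
    ...  # write to the central warehouse (stubbed)

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2025, 5, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear ETL dependency chain
```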

Posted 5 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

IMEA (India, Middle East, Africa) | India | LIXIL INDIA PVT LTD
Employee Assignment | Fully remote possible | Full Time | 1 May 2025

Title: Data Engineer

Job Description
A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making.

Key Responsibilities
- Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository.
- Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions.
- Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency.
- Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects.
- Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability.
- Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and regulatory requirements are met.

Experience & Skills
- Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.).
- Strong programming skills in SQL, Java, and Python.
- Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS, or API-based extraction.
- Expertise in real-time data processing frameworks.
- Strong understanding of Git and CI/CD for automated deployment and version control.
- Experience with Infrastructure-as-Code tools like Terraform for cloud resource management.
- Good stakeholder management skills to collaborate effectively across teams.
- Solid understanding of SAP ERP data and processes to integrate enterprise data sources.
- Exposure to data visualization and front-end tools (Tableau, Looker, etc.).
- Strong command of English with excellent communication skills.

Posted 5 days ago

Apply

0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone.

SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payments in India and unlock your full potential.

What's In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for our employees.
- Dynamic, inclusive and diverse team culture
- Gender-neutral policy
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, dental and OPD benefits
- Commitment to the overall development of employees through a comprehensive learning & development framework

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery, and ROR.

Role Accountability
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
- Ensure various critical segments as defined by the business are reviewed and performance is driven on them
- Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
- Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
- Ensure 100% data security using secured data transfer modes and data purging as per policy
- Ensure all customer complaints received are closed within the time frame
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, ensuring all necessary formalities are completed prior to allocating
- Ensure agencies raise invoices in a timely manner
- Monitor NFTE ACR and CAPE as per the collection strategy

Measures of Success
- Portfolio coverage
- Resolution rate
- Normalization/rollback rate
- Settlement waiver rate
- Absolute recovery (rupees collected)
- NFTE CAPE
- DRA certification of NFTEs
- Absolute customer complaints
- Absolute audit observations
- Process adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes

Competencies Critical to the Role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Gruve
Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

About The Role
We are looking for a highly skilled SIEM Consultant with deep hands-on experience in designing, implementing, and configuring Splunk SIEM solutions. The ideal candidate will be responsible for deploying Splunk into customer environments, onboarding diverse log sources, configuring security use cases, and integrating external tools for end-to-end threat visibility. This role demands strong technical expertise, project delivery experience, and the ability to translate security monitoring requirements into Splunk configurations and dashboards.

Key Responsibilities

SIEM Design & Implementation
- Lead the design and deployment of Splunk architecture (single/multi-site, indexer clustering, search head clustering, etc.).
- Define data ingestion strategies and architecture best practices.
- Install, configure, and optimize Splunk components (forwarders, indexers, heavy forwarders, search heads, deployment servers).
- Set up and manage Splunk deployment servers, apps, and configuration bundles.

Log Source Onboarding
- Identify, prioritize, and onboard critical log sources across IT, cloud, network, security, and application environments.
- Develop onboarding playbooks for common and custom log sources.
- Create parsing, indexing, and field extraction logic using props.conf, transforms.conf, and custom apps.
- Ensure log data is normalized and categorized according to the CIM (Common Information Model).

Use Case Development & Configuration
- Work with SOC teams to define security monitoring requirements and detection use cases.
- Configure security use cases, correlation rules, and alerting within Splunk Enterprise Security (ES) or core Splunk.
- Develop dashboards, alerts, and scheduled reports to support threat detection, compliance, and operational needs.
- Tune and optimize correlation rules to reduce false positives.

Tool Integration
- Integrate Splunk with third-party tools and platforms such as ticketing systems (ServiceNow, JIRA), Threat Intelligence Platforms (Anomali), SOAR platforms (Splunk SOAR, Palo Alto XSOAR), and endpoint & network tools (CrowdStrike, Fortinet, Cisco, etc.).
- Develop and manage APIs, scripted inputs, and custom connectors for data ingestion and bidirectional integration.

Documentation & Handover
- Maintain comprehensive documentation for architecture, configurations, onboarding steps, and operational procedures.
- Conduct knowledge transfer and operational training for security teams.
- Create runbooks, SOPs, and configuration backups for business continuity.
- Prepare HLD and LLD documents for the solution.

Required Skills & Experience
- 5+ years of experience in SIEM implementation, with at least 3 years focused on Splunk.
- Strong knowledge of Splunk architecture, deployment methods, data onboarding, and advanced search.
- Experience in building Splunk dashboards, alerts, and use case logic using SPL (Search Processing Language).
- Familiarity with the Common Information Model (CIM) and data normalization.
- Experience integrating Splunk with external tools and writing automation scripts (Python, Bash, etc.).

Preferred Certifications
- Splunk Core Certified Power User
- Splunk Certified Admin or Architect
- Splunk Enterprise Security Certified Admin (preferred)
- Security certifications like CompTIA Security+, GCIA, or CISSP (optional but beneficial)

Why Gruve
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you're passionate about technology and eager to make an impact, we'd love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
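The field-extraction and CIM-normalization work above is normally configured in Splunk's props.conf/transforms.conf, but the idea can be hedged into a few lines of Python: pull fields out of a raw event with a regex and rename them to CIM-style names (src, dest, action). The sample log line and pattern are invented; real vendor formats vary.

```python
import re

# Invented firewall-style event; real formats vary by vendor.
raw = "2025-05-28T10:14:03Z fw01 DENY TCP 10.0.0.5:51234 -> 203.0.113.9:443"

PATTERN = re.compile(
    r"(?P<time>\S+) (?P<host>\S+) (?P<fw_action>\w+) (?P<proto>\w+) "
    r"(?P<src_ip>[\d.]+):(?P<src_port>\d+) -> (?P<dst_ip>[\d.]+):(?P<dst_port>\d+)"
)

# Map vendor field names onto CIM-style names used by the data model.
CIM_MAP = {"src_ip": "src", "dst_ip": "dest", "fw_action": "action",
           "proto": "transport"}

def normalize(event: str) -> dict:
    m = PATTERN.match(event)
    if not m:
        return {}  # unparsed events would be flagged for onboarding work
    return {CIM_MAP.get(k, k): (v.lower() if k == "fw_action" else v)
            for k, v in m.groupdict().items()}

print(normalize(raw))
# -> {'time': '2025-05-28T10:14:03Z', 'host': 'fw01', 'action': 'deny',
#     'transport': 'TCP', 'src': '10.0.0.5', ..., 'dest': '203.0.113.9', ...}
```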

Posted 5 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Service Management Practitioner
Project Role Description: Support the delivery of programs, projects or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable the strategic agenda.
Must have skills: Microsoft Power Business Intelligence (BI)
Good to have skills: Microsoft Power Apps
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Service Management Practitioner, you will support the delivery of programs, projects, or managed services; coordinate projects through contract management and shared service coordination; and develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable the strategic agenda.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Coordinate the delivery of programs, projects, or managed services.
- Develop and maintain relationships with key stakeholders and sponsors.
- Ensure high levels of commitment from stakeholders.
- Enable the strategic agenda through effective coordination.
- Provide regular updates and reports on project progress.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Power Business Intelligence (BI).
- Good-to-have skills: Experience with Microsoft Power Apps.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Power Business Intelligence (BI).
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

12.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: IBM Cognos TM1
Good to have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: Must have completed 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the successful implementation of applications and collaborating with various teams to deliver high-quality solutions. Your typical day will involve designing and developing applications, troubleshooting issues, and contributing to key decisions to enhance application functionality and performance.

Roles & Responsibilities:
- Expected to be an SME in IBM Cognos TM1
- Collaborate with and manage the team to perform effectively
- Responsible for team decisions and ensuring successful application implementation
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Design, build, and configure applications based on business processes and requirements
- Troubleshoot and resolve application issues
- Contribute to key decisions to enhance application functionality and performance

Professional & Technical Skills:
- Must-have skills: Proficiency in IBM Cognos TM1
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 12 years of experience in IBM Cognos TM1
- This position is based in Coimbatore
- Must have completed 15 years of full-time education

Posted 5 days ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Gruve Gruve is an innovative software services startup dedicated to transforming enterprises to AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks. About The Role We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, Deployment, Implementation and maintaining security posture for the organization. Key Responsibilities SIEM Administration: Install, configure, maintain, and upgrade SIEM components. (IBM Qradar SIEM, DNIF, Splunk & Securonix). Log Management Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.) Custom log source integration and parser development. System Monitoring & Troubleshooting Ensure SIEM tools are functioning optimally. Monitor & regular health check perform for SIEM tools. troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning Perform root cause analysis for system failures & performance issues. Optimize system performance and storage management for SIEM Integration & Automation Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with SIEM. Compliance & Audits Ensure log retention policies comply with regulatory standards. Develop & enforce SIEM access controls & user roles/permissions. Documentation & Training Document system configurations, SOP’s & troubleshooting documents. Prepare monthly/ weekly reports and PPT, onboarding documentation as per business/ client requirement. Dashboard & Report Development Create & maintain custom dashboards & reports Optimize searches & reports for performance and efficiency. Other Knowledge Base Hands on experience with Linux OS & Windows OS Basic to mediator level knowledge in networking skills Should be familiar with Azure, AWS or GCP products Required Skills & Qualifications B.E/B.Tech degree in computer science, Cybersecurity, or related field (preferred). 1-3 years experience as Soc Admin Strong knowledge of SIEM architecture, log sources, and event correlation. Proficiency in log management, regular expressions, and network security concepts. Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.). Scripting knowledge (Python, Bash, or PowerShell) is a plus. Training or Certificate on Splunk or IBM Qradar Preferred. Soft Skills Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and in a team. Must Have Skills Hands-on experience with SIEM tools like IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF etc. Proficiency in IBM Qradar & Splunk administration Configuring, maintaining, and troubleshooting SIEM solutions. Log source integration, parsing, and normalization. Strong knowledge of TCP/IP, DNS, HTTP, SMTP, FTP, VPNs, proxies, and firewall rules. Familiarity with Linux and Windows system administration. Why Gruve At Gruve, we foster a culture of innovation, collaboration, and continuous learning. 
We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
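The log onboarding and parser development duties above boil down to turning raw, vendor-specific log lines into a normalized event schema. Below is a minimal Python sketch of that idea; the log format, field names, and schema are invented for illustration and do not correspond to any particular SIEM's parser grammar.

```python
import re

# Hypothetical firewall syslog line; real formats vary by vendor and SIEM.
RAW = "Jun 12 10:41:03 fw01 DROP src=10.0.0.5 dst=203.0.113.9 spt=51234 dpt=443 proto=TCP"

# A minimal parser: extract fields and map them to a normalized schema,
# similar in spirit to what a custom SIEM parser does.
PATTERN = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<action>\w+) "
    r"src=(?P<src_ip>\S+) dst=(?P<dst_ip>\S+) "
    r"spt=(?P<src_port>\d+) dpt=(?P<dst_port>\d+) proto=(?P<proto>\w+)$"
)

def normalize(line: str) -> dict | None:
    m = PATTERN.match(line)
    if not m:
        return None  # unparsed events should be flagged, not silently dropped
    event = m.groupdict()
    # Normalization: consistent types and casing across all log sources.
    event["src_port"] = int(event["src_port"])
    event["dst_port"] = int(event["dst_port"])
    event["action"] = event["action"].lower()
    event["proto"] = event["proto"].upper()
    return event

print(normalize(RAW))
```

In a real deployment this logic would live in the SIEM's own parser framework (for example, a QRadar DSM extension or Splunk props/transforms) rather than standalone Python; the point here is only the parse-then-normalize pattern.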

Posted 5 days ago

Apply

7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role : Full Stack Engineer Project Role Description : Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset. Must have skills : Java Full Stack Development, Node.js Good to have skills : NA Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : BE Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Develop and engineer end-to-end features of a system. - Deliver innovative solutions to improve client services. - Utilize development skills to solve challenging business problems. - Stay updated with new technologies and apply them to projects. Professional & Technical Skills: - Must-Have Skills: Proficiency in Java Full Stack Development, Apache Kafka. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development. - This position is based at our Bengaluru office. - A BE degree is required.
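As a concrete illustration of the data munging skills listed above (cleaning, transformation, normalization), here is a minimal Python sketch using pandas; the dataset and column names are invented for the example.

```python
import pandas as pd

# Illustrative dataset; column names are made up for the example.
df = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 4],
    "monthly_spend": [120.0, 120.0, None, 300.0, 80.0],
})

# Cleaning: drop exact duplicates, impute missing values with the median.
df = df.drop_duplicates()
df["monthly_spend"] = df["monthly_spend"].fillna(df["monthly_spend"].median())

# Normalization: min-max scale to [0, 1] so features are comparable.
lo, hi = df["monthly_spend"].min(), df["monthly_spend"].max()
df["monthly_spend_norm"] = (df["monthly_spend"] - lo) / (hi - lo)

print(df)
```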

Posted 5 days ago

Apply

7.5 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role : Business Analyst Project Role Description : Analyze an organization and design its processes and systems, assessing the business model and its integration with technology. Assess current state, identify customer requirements, and define the future state and/or business solution. Research, gather and synthesize information. Must have skills : Data Analytics Good to have skills : NA Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Business Analyst, you will analyze an organization and design its processes and systems, assessing the business model and its integration with technology. You will assess the current state, identify customer requirements, and define the future state and/or business solution. Research, gather, and synthesize information to drive business decisions. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead process improvement initiatives to enhance efficiency. - Conduct data analysis to identify trends and insights. - Develop business cases and recommendations based on data analysis. - Facilitate communication between business stakeholders and technical teams. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Analytics. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Data Analytics. - This position is based at our Hyderabad office. - A 15 years full-time education is required.
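One of the responsibilities above is conducting data analysis to identify trends. A minimal sketch of that task, assuming a hypothetical monthly revenue series, might fit a linear trend with scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly KPI values; in practice these would come from a warehouse query.
months = np.arange(1, 13).reshape(-1, 1)           # Jan..Dec as the feature
revenue = np.array([102, 98, 110, 115, 117, 121,    # illustrative figures only
                    119, 125, 131, 128, 135, 140], dtype=float)

model = LinearRegression().fit(months, revenue)
print(f"trend: {model.coef_[0]:+.2f} per month, baseline {model.intercept_:.1f}")
print(f"forecast for month 13: {model.predict([[13]])[0]:.1f}")
```

The fitted slope gives a simple, explainable trend estimate that can back a business case or a dashboard annotation; more rigorous work would test for seasonality before trusting a straight line.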

Posted 5 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Project Role : Service Management Practitioner Project Role Description : Support the delivery of programs, projects or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda. Must have skills : Microsoft Power Business Intelligence (BI) Good to have skills : Microsoft Power Apps Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Service Management Practitioner, you will support the delivery of programs, projects, or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable the strategic agenda. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Coordinate the delivery of programs, projects, or managed services. - Develop and maintain relationships with key stakeholders and sponsors. - Ensure high levels of commitment from stakeholders. - Enable the strategic agenda through effective coordination. - Provide regular updates and reports on project progress. Professional & Technical Skills: - Must-Have Skills: Proficiency in Microsoft Power Business Intelligence (BI). - Good-to-Have Skills: Experience with Microsoft Power Apps. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Power Business Intelligence (BI). - This position is based at our Chennai office. - A 15 years full-time education is required.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP Sales and Distribution (SD) Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute to key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the effort to design, build, and configure applications - Act as the primary point of contact for the project - Manage the team and ensure successful project delivery Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP Sales and Distribution (SD) - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information: - The candidate should have a minimum of 5 years of experience in SAP Sales and Distribution (SD) - This position is based in Mumbai - A 15 years full-time education is required

Posted 5 days ago

Apply

0 years

0 Lacs

Sonipat, Haryana, India

On-site


ABOUT US: Newton School and Rishihood University have formed a powerful partnership to drive transformation in the world of technology and education. Newton School, dedicated to bridging the employability gap, has partnered with Rishihood University, India's first impact university. Together, we will be revolutionizing education, empowering students, and shaping the future of technology. With a team of experienced professionals and renowned investors, we are united in our mission to solve the employability challenge and make a lasting impact on society. Job Summary Are you passionate about computer science? Join us as a Subject Matter Expert in the Computer Science department at Sonipat, Delhi NCR. We are seeking an experienced professional to deliver high-quality lectures, design course content, mentor students, and take lab classes, ensuring their success in the tech field. Key Responsibilities Develop course materials and curriculum. Collaborate with team members to improve the learning experience. Provide guidance and support to students in understanding the subjects and programming languages. Take ownership of labs and guide students in creating and developing projects. Qualifications M.Tech in Computer Science or a related field. Experience in teaching or mentoring students is preferred. Excellent communication and presentation skills. Experience with industry-standard tools and technologies. Experience with DSA, MERN, or DBMS; any one or more is acceptable. Requirements Strong expertise in Database Management Systems: Relational Database Management Systems (RDBMS), Querying in SQL, Normalization, Indexing, Transactions, Query Optimization, Data Modeling, Database Design, ACID properties, NoSQL Databases. Preferred frontend technologies: HTML, CSS, JavaScript, React.js. Strong expertise in Advanced Data Structures and Algorithms: Arrays, Linked Lists, Stacks, Queues, Trees, Graphs, Sorting Algorithms, Searching Algorithms, Dynamic Programming, Algorithm Analysis, Recursion. Any of the above-mentioned technologies is accepted. Key Responsibility Areas Course Development and Planning Owning and running labs Cross-Functional Team Collaboration Owning up to Academic Success Tutoring and Student Support Stakeholder Management Willingness to work in Sonipat, Delhi NCR. Perks And Benefits Market Competitive Salaries Research Opportunities and industry collaborations. Inculcate research and innovation in students, and help Rishihood University to do cutting-edge work in the computer science department. State-of-the-Art Facilities in Labs and Classrooms.
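To make the database normalization topic above concrete, here is a small teaching sketch using Python's built-in sqlite3 module: it models the 3NF split of a denormalized orders table into customers and orders with a foreign key, adds an index, and reassembles the data with a join. The schema and data are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# A denormalized table would repeat customer data on every order (update
# anomalies). Normalizing to 3NF splits it into customers and orders,
# linked by a foreign key.
cur.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
CREATE INDEX idx_orders_customer ON orders(customer_id);  -- speeds up the join
""")

cur.execute("INSERT INTO customers VALUES (1, 'Asha', 'Pune')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 450.0), (11, 1, 120.0)])
con.commit()

# The customer's city now lives in exactly one row; the join reassembles the view.
for row in cur.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```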

Posted 5 days ago

Apply

15.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Project Role : Technology Delivery Lead Project Role Description : Manages the delivery of large, complex technology projects using appropriate frameworks and collaborating with sponsors to manage scope and risk. Drives profitability and continued success by managing service quality and cost and leading delivery. Proactively support sales through innovative solutions and delivery excellence. Must have skills : SAP FI S/4HANA Accounting Good to have skills : NA Minimum 15 Year(s) Of Experience Is Required Educational Qualification : Any Degree Summary: As a Technology Delivery Lead, you will manage the delivery of large, complex technology projects using appropriate frameworks and collaborating with sponsors to manage scope and risk. You will drive profitability and continued success by managing service quality and cost and leading delivery. Additionally, you will proactively support sales through innovative solutions and delivery excellence. Roles & Responsibilities: - Expected to be an SME with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Manage the delivery of large, complex technology projects using appropriate frameworks. - Collaborate with sponsors to manage scope and risk. - Drive profitability and continued success by managing service quality and cost. - Lead delivery and ensure delivery excellence. - Proactively support sales through innovative solutions. Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP FI S/4HANA Accounting. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 15 years of experience in SAP FI S/4HANA Accounting. - This position is based at our Hyderabad office. - Any degree is required.

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote


Design, implement, and optimize low-level systems for large-scale web data collection. You will work beneath abstraction layers to develop highly efficient, adaptive scraping techniques that can navigate anti-bot protections, manipulate network protocols, and modify browser behavior at a deep level. Contract positions available. We are looking for someone with hands-on experience in browser instrumentation, network security, and kernel-level systems, with a focus on performance, resilience, and scalability. Key Responsibilities Architect and maintain a high-performance, scalable web scraping infrastructure. Implement custom browser instrumentation using Chrome DevTools Protocol (CDP), WebDriver Wire Protocol, and related technologies. Develop low-level networking solutions, including TCP/IP socket programming, TLS handshake customization, and QUIC protocol manipulation. Reverse-engineer and bypass anti-bot detection mechanisms (e.g., Cloudflare, Akamai) using network traffic normalization, JA3/JA3S fingerprint customization, and entropy-based behavior randomization. Modify and optimize browser engines (e.g., V8, SpiderMonkey) for efficient headless rendering and fingerprint evasion. Implement containerized and distributed crawling solutions with deep knowledge of cgroups, namespaces, and container orchestration internals. Develop high-throughput data extraction and parsing systems, optimizing XPath queries, DOM mutation tracking, and real-time content differentiation. Build self-healing, resilient infrastructure for continuous, adaptive data collection at scale. Utilize AI/ML techniques to enhance dynamic crawling, adaptive evasion strategies, and anomaly detection in blocking events. What We’re Looking For Deep expertise in low-level systems (Linux internals, network protocols, kernel-level networking). Hands-on experience with browser technologies (CDP, Firefox Remote Debugging, WebDriver, Blink engine). Proficiency in advanced networking techniques, including TLS fingerprint customization, raw TCP/IP socket programming, and traffic pattern normalization. Strong security background, with experience in bypassing anti-bot mechanisms and evading detection. Experience with distributed and containerized infrastructure, including Kubernetes, Terraform, and eBPF. Familiarity with AI/ML-driven automation techniques for web data extraction. Ability to operate beneath abstraction layers, designing custom solutions for challenging scraping environments.
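For a sense of what "TLS handshake customization" involves at the simplest level, here is a hedged sketch using only Python's standard library: narrowing the offered cipher suites and ALPN protocols changes the ClientHello, and therefore the JA3 hash a detection system computes from it. This is only the simplest knob; full JA3 control (extension ordering, GREASE, curve lists) needs lower-level tooling than the stdlib exposes, and the target host below is a placeholder.

```python
import socket
import ssl

# Restricting offered cipher suites and ALPN alters the ClientHello fields
# that feed into a JA3 fingerprint. Note: for TLS 1.3, set_ciphers() does
# not restrict the 1.3 suite list, so this mainly shapes the 1.2 offer.
ctx = ssl.create_default_context()
ctx.set_ciphers("ECDHE+AESGCM")          # narrower suite list than the default
ctx.set_alpn_protocols(["http/1.1"])     # ALPN is also part of the fingerprint

host = "example.com"  # placeholder target
with socket.create_connection((host, 443), timeout=10) as raw:
    with ctx.wrap_socket(raw, server_hostname=host) as tls:
        print("negotiated:", tls.version(), tls.cipher()[0])
```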

Posted 5 days ago

Apply

3.0 years

0 Lacs

Greater Nashik Area

On-site


Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics Purpose of the role Anheuser-Busch InBev (AB InBev)’s Supply Analytics is responsible for building competitive, differentiated solutions that enhance brewery efficiency through data-driven insights. We optimize processes, reduce waste, and improve productivity by leveraging advanced analytics and AI-driven solutions. As a Data Scientist, you will work at the intersection of: conceptualizing the analytical solution for the business problem by implementing statistical models and programming techniques; applying machine learning solutions; using best-in-class cloud technology & microservices architecture; and following DevOps best practices that include model serving and data & code versioning. Key tasks & accountabilities Develop and fine-tune Gen AI models to solve business problems, leveraging LLMs and other advanced AI techniques. Design, implement, and optimize AI-driven solutions that enhance automation, efficiency, and decision-making. Work with cloud-based architectures to deploy and scale AI models efficiently using best-in-class microservices. Apply DevOps and MLOps best practices for model serving, data and code versioning, and continuous integration/deployment. Collaborate with cross-functional teams (engineering, business, and product teams) to translate business needs into AI-driven solutions. Ensure model interpretability, reliability, and performance, continuously improving accuracy and reducing biases. Develop internal tools and utilities to enhance the productivity of the team and streamline workflows. Maintain best coding practices, including proper documentation, testing, logging, and performance monitoring. Stay up to date with the latest advancements in Gen AI, LLMs, and deep learning to incorporate innovative approaches into projects. Qualifications, Experience, Skills Level Of Educational Attainment Required Academic degree in, but not limited to, a Bachelor's or Master's in Computer Applications, Computer Science, or any engineering discipline. Previous Work Experience Minimum 3 years of relevant experience. Technical Skills Required Programming Languages: Proficiency in Python. Mathematics and Statistics: Strong understanding of linear algebra, calculus, probability, and statistics. Machine Learning Algorithms: Knowledge of supervised, unsupervised, and reinforcement learning techniques. Natural Language Processing (NLP): Understanding of techniques such as tokenization, POS tagging, named entity recognition, and machine translation.
LLMs: Experience with LangChain, inference with LLMs, fine-tuning LLMs for specific tasks, and prompt engineering. Data Preprocessing: Skills in data cleaning, normalization, augmentation, and handling imbalanced datasets. Database Management: Experience with SQL and NoSQL databases like MongoDB and Redis. Cloud Platforms: Familiarity with Azure and Google Cloud Platform. DevOps: Knowledge of CI/CD pipelines, Docker, Kubernetes. Other Skills Required APIs: Experience with FastAPI or Flask. Software Development: Understanding of the software development lifecycle (SDLC) and Agile methodologies. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
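As a toy illustration of the tokenization and text normalization skills listed above, here is a minimal pure-Python sketch; the corpus is invented, and a production pipeline would add stop-word removal, lemmatization, and a proper tokenizer.

```python
import re
from collections import Counter

# Toy corpus; real pipelines would pull event logs or documents from a store.
docs = [
    "Brewery line 3 stopped at 14:02 - valve FAULT!",
    "Valve fault cleared on brewery line 3.",
]

def preprocess(text: str) -> list[str]:
    text = text.lower()                       # case normalization
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # strip punctuation
    return text.split()                       # whitespace tokenization

tokens = [tok for d in docs for tok in preprocess(d)]
print(Counter(tokens).most_common(5))
```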

Posted 5 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Data Engineer - ETL Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner. What You’ll Be Doing What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of data layers. Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, the choice of storage and querying technology, and performance tuning. Lead hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and the implications for data consumers. Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform. Design prototypes and work in a fast-paced, iterative solution delivery model. Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables. Use Harness for the deployment pipeline. Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed. Diagnose system performance issues related to data processing and implement solutions to address them. Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture. Maintain integrity and quality across all pipelines and environments. Understand and follow secure coding practices to ensure code is not vulnerable. You will report to the Application Manager. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Effective communication skills. Bachelor’s degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience. Relevant years of extensive work experience in various data engineering & modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills. Relevant years of programming experience using Databricks. Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
Solid knowledge of network and firewall concepts. Solid experience writing, optimizing, and analyzing SQL. Relevant years of experience with Python. Ability to break complex data requirements into achievable targets and architect solutions. Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile. Experience using Harness. Technical lead responsible for both individual and team deliveries. Desired Skills And Abilities Worked on big data migration projects. Worked on performance tuning at both database and big data platforms. Ability to interpret complex data requirements and architect solutions. Distinctive problem-solving and analytical skills combined with robust business acumen. Strong fundamentals in Parquet and Delta file formats. Effective knowledge of the Azure cloud computing platform. Familiarity with reporting software - Power BI is a plus. Familiarity with DBT is a plus. Passion for data and experience working within a data-driven organization. You care about what you do, and what we do. Who WE Are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe Robust support for Flexible Working Arrangements Enhanced family friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future.
Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.
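The core responsibility above, ETL pipelines using PySpark in Azure Databricks with Delta tables, typically follows a read-transform-write pattern. Here is a minimal sketch under assumed paths and column names (placeholders, not AXA XL's actual schema); on Databricks the Delta format and a SparkSession are available out of the box.

```python
from pyspark.sql import SparkSession, functions as F

# Paths and column names here are placeholders for illustration only.
spark = SparkSession.builder.appName("policy_etl").getOrCreate()

# Extract: read raw records (Parquet here; could equally be an ADLS mount).
raw = spark.read.parquet("/mnt/raw/policies")

# Transform: basic cleansing and a derived load-date column.
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("double"))
       .filter(F.col("premium").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: write as a Delta table, partitioned for downstream query performance.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("load_date")
      .save("/mnt/curated/policies"))
```

Partitioning by load date is one common choice for incremental consumers; the right partition key depends on the dominant query patterns against the curated layer.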

Posted 5 days ago

Apply
