
KTwo Hiring

6 Job openings at KTwo Hiring
Payroll Lead | India | 5 years | Salary not disclosed | Remote | Full Time

Payroll Lead – Remote | Fast-growing Global Tech Company

We're looking for a hands-on Payroll Lead who's ready to take full ownership of payroll across India, the US, and Singapore for a scaling global team (300+ employees and growing fast). This isn't just about running payroll; it's about building a function that's accurate, scalable, and employee-first.

What you'll own:
● End-to-end payroll execution (India, US, SG)
● Managing vendor tools like Keka, Sequoia One, and EORs
● Ensuring 100% compliance across all jurisdictions
● Building better systems and automations for scale
● Being the go-to person for everything payroll: solving, communicating, reporting

You'll thrive here if you:
● Have 5+ years of hands-on payroll experience across multiple countries
● Have worked with fast-growing companies or handled scale independently
● Enjoy building systems, not just maintaining them
● Want ownership and clarity, not chaos

This is a full-time, remote role in a remote-first company where trust, autonomy, and impact are everything. You'll work closely with finance, HR, and leadership to shape payroll as a strategic function. Let me know if you'd like to take the conversation forward; we'd love to tell you more!

SWIFT Payments Consultant | Gurugram, Haryana, India | 0 years | Salary not disclosed | Remote | Full Time

Provides 24x7 support for NAB's SWIFT products for international and domestic payments and services.

The Role:
You will be part of the SWIFT payments engineering team, supporting and maintaining Australia's most extensive international payment transfer system. This role has a mandatory requirement for 24x7 rostered on-call support, including a scheduled weekend roster. You will work with various teams across the bank, particularly group security, risk and assurance, IT infrastructure, projects, the compliance team, and third-party vendors.

Your opportunity:
● Improve processes and code to optimise application performance and throughput.
● Improve the onboarding and maintenance of user data, streamlining and decreasing the time to onboard users.
● Improve the application's cyber security hygiene.
● Participate in a variety of large-scale projects supporting NAB's payments strategy.
● Learn cyber security improvements and implement them in the system.

Main responsibilities:
● Support and maintain critical payments applications 24/7, including on-call duties and weekend work.
● Maintain application hygiene in line with the Group Information Risk Policy.

Highly desirable – the ability to diagnose and troubleshoot NAB SWIFT products across:
a. Swift Alliance Access
b. Swift Alliance Gateway
c. Swift Alliance Gateway VPNs
d. Swift Browse services
e. Swift SWP services
f. Swift HSM
g. Swift Micro Gateway
h. Other SWIFT international services, including but not limited to HK, New York, Bank of England, and MEPS+
i. SWIFT messages across its services (MT/MX)
j. SWIFT certificate renewal
k. SWIFT batch processing and Control-M
l. Networking connectivity issues reported by end users
m. User access across CyberArk, local accounts, scripts, Splunk, and Cribl
n. Issues with the NAB Azure AVD SWIFT design, NAB Azure jump hosts, NAB Azure Ansible hosts, and NAB on-prem AIX and RHEL designs

Highly desirable – the ability to:
● Deploy SWIFT quarterly security patches on Red Hat servers
● Remediate vulnerability items (CTRs, VITs, and penetration findings) across the host, infrastructure, and application levels
● Maintain all accounts, passwords, and certificates within the GIRP SLA
● Demonstrate an in-depth understanding of the SWIFT Cyber Security Programme
● Develop high-quality system changes and runbooks
● Conduct disaster recovery plans, including application restore and backup
● Gather system data to produce capacity plans

What you'll bring – essential capabilities:
● Experience working in 24x7 support environments (including weekend work)
● Experience working across onshore/offshore support environments supporting customer-critical applications
● SWIFT experience and exposure to SWIFT messaging flows
● Experience implementing, supporting, and enhancing applications, ideally with banking and/or finance industry exposure
● Experience with: firewall reviews and remediation; threat analysis; Red Hat, AIX, and Windows operating systems; Azure virtual desktops and cloud technology; the SWIFT Cyber Security Programme and SWIFT attestations; working with principal engineers and security architects; and understanding company security vulnerabilities, penetration testing, and security policies and how to enforce them

Desirable skills:
● Payments experience (domestic and international)
● Understanding of firewalls, networking, secure zones, and remote access
● Understanding of disaster recovery, segregation of duties, and cyber security attestation

SailPoint Engineer (IAM) Development | Gurugram, Haryana, India | 0 years | Salary not disclosed | On-site | Full Time

● Extensive knowledge and proven experience of SailPoint IdentityIQ
● Ability to triage configuration defects, suggest remediations, update designs, and configure the solution
● Proven experience of certification configuration, including dynamic reviewers, consultation, escalation, etc.
● Awareness of the SNOW/SailPoint module
● Understanding of deployment and change processes (Dev, Pre-Prod to Prod; SIT and UAT testing)
● Creation of dynamic reports to support audit requests and user access visibility
● Prioritisation and organisation skills to ensure any audit-related work is prioritised
● Design and implementation of SailPoint IdentityIQ solutions, including user provisioning, de-provisioning, and access certifications
● Configuration of workflows, roles, policies, and connectors for various enterprise applications
● Understanding of and experience in managing and maintaining role-based access control (RBAC) and attribute-based access control (ABAC) frameworks
● Experience integrating SailPoint with existing enterprise systems, including Active Directory, cloud platforms (e.g., AWS, Azure), and third-party applications

SailPoint Engineer (IAM) Development | Haryana, India | 3–7 years | Salary not disclosed (INR) | On-site | Full Time

As an individual with extensive knowledge and proven experience in SailPoint IdentityIQ, you will be responsible for triaging configuration defects, suggesting remediations, updating designs, and configuring solutions. Your role will include certification configuration tasks such as implementing dynamic reviewers, providing consultation, and managing escalations. It is essential to have awareness of the SNOW/SailPoint module and to be familiar with deployment and change processes from Dev to Prod environments, including SIT and UAT testing.

You will be expected to create dynamic reports that support audit requests and enhance user access visibility. Your prioritization and organizational skills will play a crucial role in ensuring that any audit-related work is effectively managed and completed in a timely manner.

Furthermore, you will design and implement SailPoint IdentityIQ solutions, focusing on user provisioning, de-provisioning, and access certifications. Your responsibilities will also include configuring workflows, roles, policies, and connectors for various enterprise applications. Experience in managing role-based access control (RBAC) and attribute-based access control (ABAC) frameworks is required. Additionally, you will integrate SailPoint with existing enterprise systems such as Active Directory, cloud platforms like AWS and Azure, and third-party applications.

Overall, your expertise in SailPoint IdentityIQ and your ability to effectively manage configuration, certification, deployment, and integration tasks will be instrumental in ensuring the security and efficiency of the organization's identity and access management processes.

ML Platform Specialist | Haryana, India | 3–7 years | Salary not disclosed (INR) | On-site | Full Time

The ML Platform Specialist role involves designing, implementing, and maintaining machine learning infrastructure and workflows on the Databricks Lakehouse Platform. Your primary responsibility will be to ensure the successful deployment, monitoring, and scaling of machine learning models across the organization.

You will design and implement scalable ML infrastructure on the Databricks Lakehouse Platform, develop CI/CD pipelines for machine learning models, and create automated testing and validation processes using Databricks MLflow. Additionally, you will be responsible for managing model monitoring systems, collaborating with various teams to optimize machine learning workflows, and maintaining reproducible machine learning environments using Databricks notebooks and clusters. Furthermore, you will implement advanced feature engineering and management using the Databricks Feature Store, optimize machine learning model performance, and ensure data governance, security, and compliance within the Databricks environment. Your role will also involve creating and maintaining comprehensive documentation for ML infrastructure and processes, as well as working across teams from multiple suppliers to drive continuous improvement and transformation initiatives for MLOps/DataOps.

To be successful in this role, you should have a Bachelor's or Master's degree in Computer Science, Machine Learning, Data Engineering, or a related field, along with 3–5 years of experience in MLOps and expertise in Databricks and/or Azure ML. You should possess advanced proficiency with the Databricks Lakehouse Platform, strong experience with Databricks MLflow, and expert-level programming skills in Python, including knowledge of PySpark, MLlib, Delta Lake, and the Azure ML SDK. Moreover, you should have a deep understanding of the Databricks Feature Store and feature engineering techniques, experience with Databricks workflows and job scheduling, and proficiency in machine learning frameworks compatible with Databricks and Azure ML, such as TensorFlow, PyTorch, and scikit-learn. Knowledge of cloud platforms such as Azure Databricks, Azure DevOps, and Azure ML, as well as exposure to Terraform, ARM/Bicep, distributed computing, and big-data processing techniques, will be essential for this role. Experience with containerization, Web Apps, Kubernetes, Cognitive Services, and other MLOps tools will be considered a plus as you contribute to the continuous improvement and transformation of MLOps/DataOps in the organization.

Data Product Engineering Specialist | Haryana, India | 5–9 years | Salary not disclosed (INR) | On-site | Full Time

The Data Product Engineering Specialist will be responsible for designing, building, and optimizing strategic data assets that enable advanced analytics, reporting, and operational efficiencies. This role sits at the intersection of data design, product management, and business strategy, ensuring that data assets are structured, governed, and made accessible in a scalable and reusable manner.

Main responsibilities:
● Design, build, and maintain scalable data products to support analytics, AI/ML, and operational business needs.
● Develop high-quality data pipelines and reusable configurable frameworks, ensuring robust, efficient, and scalable data processing.
● Implement data transformation and enrichment processes to make raw data useful for pricing, risk modeling, claims, and other business functions.
● Ensure adherence to data modeling standards, reference data alignment, and data product governance frameworks.
● Work closely with Product Managers, Designers, and Domain SMEs to embed best practices and standards.
● Leverage cloud-based technologies and modern data platforms, focusing on Azure data services and Databricks.
● Ensure solutions align with data security, privacy, and governance policies.
● Engage with platform teams to ensure efficient deployment and performance optimization of data solutions.
● Develop automated test routines and conduct thorough testing to ensure quality deliverables.
● Integrate solutions with data management tools such as Purview to automate data quality rule implementation, metadata management, and tagging.
● Develop clean and precise documentation, including low-level designs, release notes, and how-to guides.
● Support data product launch and adoption activities.
● Keep skills up to date, developing technology expertise in other areas of the platform, and stay current on emerging data technologies to continuously enhance data products and improve efficiency and business value.
● Optimize data processing performance and cost-efficiency by leveraging automation and modern engineering best practices.
● Identify opportunities for AI/ML integration and automation within data pipelines and business processes.

What you'll bring:
● Advanced understanding of building and deploying production-grade solutions in complex projects.
● Strong experience in data engineering, data modeling, and pipeline development.
● Proficiency in SQL, Python, Spark, Databricks, and cloud-based data engineering tools (Azure preferred; AWS, GCP).
● Strong experience in software engineering practices and implementation.
● Experience with big-data processing, streaming architectures (Kafka, Event Hubs), and real-time data solutions.
● Understanding of data governance, metadata management, and data lineage frameworks.
● Knowledge of CI/CD practices, DevOps, and Infrastructure as Code (IaC) for data solutions.
● Knowledge of advanced tools and techniques to identify and address data quality issues.
● Strong stakeholder management, with the ability to influence and drive alignment across cross-functional teams.
● Excellent problem-solving and analytical skills with a product mindset.
● Ability to work in agile environments, contributing to product roadmaps and iterative development cycles.