115 Job Openings at Hanker Systems (India)
About Hanker Systems (India)

Hanker Systems is a technology company specializing in software development and IT solutions.

Azure Data Factory ( ADF ) Developer

Bengaluru

6 - 10 years

INR 27.5 - 30.0 Lacs P.A.

Work from Office

Full Time

We are looking for an experienced Azure Data Factory (ADF) Developer to design, develop, and optimize data integration and ETL pipelines on Azure. The ideal candidate will have strong expertise in ADF, Azure Synapse, Azure Databricks, and other Azure data services, and should be skilled in ETL processes, data warehousing, and cloud-based data solutions while ensuring performance, security, and scalability.

Key Responsibilities:
- Design and develop ETL pipelines using Azure Data Factory (ADF) to ingest, transform, and process data.
- Integrate ADF with other Azure services such as Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and SQL Database.
- Develop data transformations using Mapping Data Flows, SQL, and Python/PySpark.
- Optimize ADF performance, data flow, and cost efficiency for scalable data solutions.
- Automate data pipelines, scheduling, and orchestration using triggers and event-driven workflows.
- Troubleshoot and debug ADF pipeline failures and performance bottlenecks.
- Work with Azure Monitor, Log Analytics, and Application Insights for data pipeline monitoring.
- Ensure data security, governance, and compliance with Azure security best practices.
- Collaborate with data engineers, cloud architects, and business stakeholders to define data strategies.
- Implement CI/CD for data pipelines using Azure DevOps, Git, and Infrastructure as Code (Terraform, ARM templates, or Bicep).

ETL Tester

Chennai

8 - 12 years

INR 16.0 - 18.0 Lacs P.A.

Work from Office

Full Time

- Strong SQL skills and experience with ETL tools such as Informatica, IICS, Talend, or SSIS.
- Proven experience as a QA engineer or in a similar role focused on testing Informatica Intelligent Cloud Services.
- Experience with testing tools such as JIRA.
- Strong analytical and problem-solving skills.
- Ability to work on multiple projects and prioritize work effectively.
- Real-time exposure to risk-based testing concepts, processes, approaches, and implementation.
- Real-time exposure to QA-to-QE transformation program concepts, processes, approaches, and implementation.

Release Manager

Mumbai, Delhi / NCR, Bengaluru, Remote

5 - 9 years

INR 45.0 - 50.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities:
- Ensure Implementation of Release Process: Oversee the implementation and application of the documented release process across all 15-20 Salesforce orgs, ensuring consistency, efficiency, and scalability.
- DevOps Management with GitHub for the EUR Core Model: Oversee the organization and management of DevOps practices using GitHub for our core European model, including managing the deployment of code from our central core org to local orgs.
- Integration and Collaboration: Work closely with local external providers who manage certain local orgs to ensure seamless integration and adherence to the core model.
- Training and Support: Provide training and guidance to team members on following the established release process, ensuring they understand and can execute the required steps effectively.
- Center of Excellence (CoE) Setup: Assist in establishing a CoE for Salesforce DevOps and release management, providing best practices, guidelines, and training to the internal team and external partners.
- Continuous Improvement: Identify opportunities for automation, process optimization, and continuous improvement in the release management process.
- Documentation and Reporting: Create and maintain comprehensive documentation of the release processes, DevOps practices, and CoE guidelines. Provide regular reports on release management performance and metrics.

Required Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is a plus.
- Experience:
  - Minimum of 5 years of experience in DevOps, CI/CD, and release management.
  - Proven track record of implementing DevOps and release management processes in complex environments with multiple orgs.
  - Experience managing DevOps using GitHub, including code branching, merging, and version control.
  - Previous experience with Salesforce environments and deployment strategies is highly preferred.
- Technical Skills:
  - Proficient in using GitHub and other version control systems.
  - Strong understanding of CI/CD tools and frameworks.
  - Experience with Salesforce DX and deployment tools such as Gearset, Copado, or similar.
- Soft Skills:
  - Excellent communication skills, both verbal and written.
  - Strong problem-solving skills and attention to detail.
  - Ability to work independently as well as part of a cross-functional team.
  - Leadership skills to guide and mentor team members and partners.

Expectations:
- Ensure the established DevOps and release management process is effectively implemented and followed across all orgs.
- Be comfortable working in a dynamic environment with multiple stakeholders, including internal teams and external partners.
- Proactively identify potential issues and provide solutions to enhance efficiency and reliability in the release process.
- Be flexible in adapting to different time zones and working hours as needed to collaborate with international teams.

Location: Hyderabad, Ahmedabad, Pune, Kolkata, Chennai, Remote

Python Automation Engineer

Bengaluru

5 - 7 years

INR 18.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Skills: Server Technologies, Python Development, HPE Synergy Hardware.

Responsibilities:
- Design, develop, and implement automated scripts using Python for server provisioning, configuration, and management tasks.
- Integrate with various server hardware and software components, including HPE Synergy infrastructure.
- Develop and maintain robust automation frameworks to ensure scalability and maintainability.
- Troubleshoot and resolve issues related to automation scripts and server configurations.
- Collaborate with cross-functional teams (e.g., development, operations, infrastructure) to gather requirements and implement solutions.
- Document and maintain detailed records of all automation processes and procedures.
- Stay abreast of the latest advancements in server technologies, automation tools, and best practices.

Java Developer

Chennai, Delhi / NCR, Bengaluru

4 - 6 years

INR 20.0 - 22.5 Lacs P.A.

Work from Office

Full Time

- Strong experience in Java, Spring Boot, and Spring MVC.
- Must have knowledge of Hibernate, JPA, or another persistence layer.
- Strong knowledge of Java EE technologies and the Spring framework (Core, JDBC, JMS, Messaging, Web, MVC).
- Experience building RESTful services and SOAP web services.
- Experience with XML technologies such as XSLT, XSD, XML parsers, and Java-XML binding frameworks.
- Experience with Tomcat and WebLogic servers.
- Database development skills in Oracle using SQL.
- Performance tuning of JEE applications and identifying performance bottlenecks.
- Participate in design and develop applications as per technical specifications.
- Communicate with external web services.
- Ability to think critically and solve problems.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Data Engineer

Kochi, Hyderabad, Coimbatore

6 - 10 years

INR 30.0 - 35.0 Lacs P.A.

Work from Office

Full Time

1. Should have knowledge of Data Warehouse and Data Lake concepts.
2. Should be able to build data pipelines using PySpark.
3. Should have strong SQL skills.
4. Should have exposure to the AWS environment and services such as S3, EC2, EMR, Athena, and Redshift.
5. Good to have: programming skills in Python.

Google BigQuery Developer

Kochi, Hyderabad, Coimbatore

6 - 8 years

INR 37.5 - 40.0 Lacs P.A.

Work from Office

Full Time

- 6+ years of experience in data engineering/warehousing, with at least 2 years in BigQuery and GCP.
- Strong expertise in SQL query optimization, BigQuery scripting, and performance tuning.
- Hands-on experience with ETL/ELT tools such as Cloud Dataflow (Apache Beam), Cloud Composer (Airflow), dbt, Talend, Matillion, or Informatica IICS.
- Experience with Cloud Storage, Pub/Sub, and Dataflow for real-time and batch data ingestion.
- Proficiency in Python or Java for scripting and data processing tasks.
- Experience with semi-structured data (JSON, Avro, Parquet) and BigQuery ingestion methods.
- Familiarity with CI/CD pipelines, Terraform, Git, and Infrastructure as Code (IaC).
- Strong understanding of data governance, security policies, and compliance standards in GCP.
- Experience working in Agile/Scrum environments and following DevOps practices.

Snowflake Developer

Kochi, Hyderabad, Coimbatore

6 - 8 years

INR 37.5 - 40.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities:
- Design and implement scalable Snowflake data warehouse solutions for structured and semi-structured data.
- Develop ETL/ELT pipelines using Informatica IICS, dbt, Matillion, Talend, Airflow, or equivalent tools.
- Optimize query performance and implement best practices for cost and efficiency.
- Work with cloud platforms (AWS, Azure, GCP) for data integration and storage.
- Implement role-based access control (RBAC), security policies, and encryption within Snowflake.
- Perform data modeling (Star Schema, Snowflake Schema, Data Vault) and warehouse design.
- Collaborate with data engineers, analysts, and business teams to ensure data consistency and availability.
- Automate Snowflake object creation, pipeline scheduling, and monitoring.
- Migrate existing on-premises databases (Oracle, SQL Server, Teradata, Redshift, etc.) to Snowflake.
- Implement data governance, quality checks, and observability frameworks.

Required Skills & Qualifications:
- 6+ years of experience in data engineering/warehousing, with at least 2 years in Snowflake.
- Strong expertise in Snowflake features such as Virtual Warehouses, Streams, Tasks, Time Travel, and Cloning.
- Experience in SQL performance tuning, query optimization, and stored procedures (JavaScript UDFs/UDAFs).
- Hands-on experience with ETL/ELT tools such as Informatica, dbt, Matillion, Talend, Airflow, or AWS Glue.
- Experience with Python, PySpark, or Scala for data processing.
- Knowledge of CI/CD pipelines, Git, Terraform, or Infrastructure as Code (IaC).
- Experience with semi-structured data (JSON, Parquet, Avro) and handling ingestion from APIs.
- Strong understanding of cloud platforms (AWS S3, Azure Data Lake, GCP BigQuery) and data lake architectures.
- Familiarity with BI/analytics tools such as Tableau, Power BI, Looker, or ThoughtSpot.
- Strong problem-solving skills and experience working in Agile/Scrum environments.

Azure Application Architect

Mumbai, Delhi / NCR, Bengaluru

8 - 12 years

INR 45.0 - 50.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are seeking a skilled Solution Design Developer / Azure Application Architect to evaluate existing MuleSoft APIs and design an optimized solution for refactoring them on the Azure platform. The ideal candidate will have expertise in Azure integration services, API management, and cloud-based architecture to ensure scalable, secure, and efficient solutions.

Key Responsibilities:
- Review and analyze existing MuleSoft APIs to identify areas for optimization and refactoring in Azure.
- Design and architect Azure-based solutions using services such as Azure API Management (APIM), Azure Functions, Logic Apps, and Event Grid.
- Ensure seamless integration with existing enterprise systems while enhancing performance and scalability.
- Define best practices for API design, security, governance, and lifecycle management on Azure.
- Collaborate with development teams to implement the refactored APIs, ensuring best practices in cloud-native architecture.
- Provide guidance on cost optimization, security, and compliance within the Azure ecosystem.
- Document architecture decisions, design patterns, and integration workflows.

Required Skills & Experience:
- 10+ years of experience in software architecture and solution design.
- Expertise in MuleSoft API design and integration.
- Strong knowledge of Azure services, including Azure APIM, Azure Functions, Logic Apps, Service Bus, and Event Grid.
- Experience with microservices architecture, RESTful APIs, and cloud-based application development.
- Hands-on experience in modernizing legacy integrations to cloud-native solutions.
- Understanding of security best practices (OAuth, JWT, API Gateway security).
- Strong problem-solving skills and ability to work in a fast-paced environment.

Preferred Qualifications:
- Azure certifications (e.g., Microsoft Certified: Azure Solutions Architect, Azure Developer).
- Experience with DevOps and CI/CD pipelines in Azure.

Location: Remote

SAP FICO Consultant

Pune, Chennai, Bengaluru

4 - 7 years

INR 25.0 - 27.5 Lacs P.A.

Work from Office

Full Time

We are looking for an experienced SAP FICO Consultant who will be responsible for configuring, implementing, and supporting the SAP Finance and Controlling (FICO) modules. The ideal candidate should have strong functional knowledge of Financial Accounting (FI) and Controlling (CO) processes and their integration with other SAP modules.

Key Responsibilities:
- Implement and configure SAP FICO modules (GL, AP, AR, Asset Accounting, Cost Center Accounting, Profit Center Accounting, Internal Orders, etc.).
- Gather business requirements and translate them into SAP FICO solutions.
- Provide functional expertise, guidance, and support in the SAP FI (Finance) and CO (Controlling) modules.
- Work closely with business users, technical teams, and stakeholders to ensure SAP solutions align with business needs.
- Perform SAP system enhancements, customizations, and configurations based on business needs.
- Conduct unit testing, integration testing, and user acceptance testing (UAT) for SAP FICO implementations.
- Support data migration, cutover activities, and post-go-live support for SAP FICO.
- Identify and resolve SAP FICO-related issues and provide end-user training and documentation.
- Collaborate with other SAP module teams (MM, SD, PP, HCM) for cross-functional integration.
- Stay updated on SAP S/4HANA Finance trends and best practices.

Required Skills & Qualifications:
- 4+ years of hands-on experience in SAP FICO module implementation and support.
- Strong functional expertise in Financial Accounting (FI) and Controlling (CO), including: General Ledger (GL), Accounts Payable (AP), Accounts Receivable (AR), Asset Accounting (AA), Bank Accounting, Cost Center Accounting (CCA), Profit Center Accounting (PCA), Internal Orders, Product Costing, and COPA (Profitability Analysis).
- Experience with SAP S/4HANA Finance is highly preferred.
- Understanding of FI-CO integration with the MM, SD, PP, and HR modules.
- Knowledge of taxation, financial reporting, and IFRS/GAAP compliance.
- Hands-on experience in configuration, enhancements, and support within SAP FICO.
- Strong problem-solving skills with the ability to troubleshoot and resolve SAP FICO issues.
- Knowledge of ABAP debugging and SAP tables related to FICO is a plus.
- Excellent communication skills to work with business and IT teams.

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Informatica IICS Professional

Kochi, Hyderabad, Coimbatore

6 - 8 years

INR 37.5 - 40.0 Lacs P.A.

Work from Office

Full Time

Required Skills & Qualifications:
- 6+ years of experience in Informatica ETL development, with at least 2 years in Informatica IICS.
- Strong expertise in IICS CDI, CAI, CDI-Elastic, Taskflows, and REST/SOAP API integration.
- Experience with cloud platforms (AWS, Azure, GCP) and databases such as Snowflake, Redshift, or Synapse.
- Proficiency in SQL, PL/SQL, and performance tuning techniques.
- Knowledge of PowerCenter migration to IICS is a plus.
- Hands-on experience with Data Quality, Data Governance, and Master Data Management (MDM) is desirable.
- Experience in developing and deploying APIs, microservices, and event-driven architectures.
- Strong problem-solving skills and the ability to work in an Agile/Scrum environment.

Preferred Qualifications:
- Informatica IICS Certification (CDI or CAI) is a plus.
- Exposure to Python, PySpark, or Big Data technologies is an advantage.
- Experience with CI/CD pipelines, DevOps practices, and Terraform for cloud deployments.

Full Stack Developer

Bengaluru

5 - 10 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

- Develop product features in a full-stack arena using React, Next.js, Node.js, GraphQL, and tRPC; Java with Spring Boot is good to have.
- Develop in micro-frontend SPA architectures.
- Integrate with composable web platforms such as Netlify, Vercel, etc.
- Translate business requirements into technical ones, with very strong communication skills.
- Guide junior colleagues and help them develop their skills and qualities.
- Help build a strong customer data platform to enable extreme personalization across all digital touchpoints.
- Help the team build scalable, easy-to-maintain software to support millions of users and transactions.
- Good experience building front-end apps in JavaScript, Next.js, React, and Node.js, as well as experience with backend systems using object-oriented programming languages like Java.
- Knowledge of Spring or other similar frameworks.
- Knowledge of micro-frontend architecture.
- Experience with BFF (backend-for-frontend) design patterns.
- A creative and precise problem solver and a quick learner, adapting to changing requirements in a fast-paced environment.
- Digital product/UX understanding.
- AWS experience.
- CI/CD Jenkins pipeline configuration.
- Bachelor's degree in Computer Science or Computer Engineering.

Senior Data Scientist

Pune

5 - 9 years

INR 18.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Day-to-day Responsibilities:
- Design and build machine learning systems based on business requirements and objectives.
- Research and implement appropriate ML algorithms and tools.
- Perform exploratory data analysis to identify patterns, trends, and correlations in large datasets.
- Solve complex problems, comparing alternative solutions, trade-offs, and diverse points of view to determine a path forward.
- Deploy and manage machine learning models in production and ensure their scalability and efficiency.
- Train and test ML models at scale in the cloud.
- Monitor and evaluate model performance, making necessary adjustments to improve accuracy and efficiency.
- Stay updated on the latest developments in machine learning and AI to bring innovative solutions to the team.

Skills and Qualifications:
- 5+ years of experience implementing and deploying machine learning and deep learning frameworks.
- Strong proficiency in Python and deep learning frameworks such as TensorFlow, PyTorch, or Keras.
- Experience in productionizing ML models.
- Experience with natural language processing (NLP) algorithms.
- Experience with containerization technologies (e.g., Docker).
- Solid understanding of statistical methods and data analysis.
- Solid understanding of LLMs, prompt engineering, RAG, GraphRAG, and the LangChain framework.
- Solid understanding of machine learning algorithms, data structures, and software engineering principles.
- Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) and their ML services.
- Experience with version control systems such as Git and Bitbucket.
- Ability to work independently and proactively find solutions to challenges.
- Ability to collaborate effectively with team members and stakeholders.

Verification Engineer

Bengaluru

5 - 9 years

INR 40.0 - 45.0 Lacs P.A.

Work from Office

Full Time

Well versed in digital design, SystemVerilog (SV), and UVM. Experience in DDR and PHY protocols.

Responsibilities:
- Develop and execute comprehensive verification plans and testbenches using UVM methodologies.
- Write and debug complex SystemVerilog test code for functional, performance, and regression testing.
- Collaborate with design engineers to understand and analyze design specifications.
- Identify, debug, and isolate design issues.
- Participate in design reviews and contribute to design improvements.
- Maintain and enhance existing verification environments.
- Stay abreast of the latest verification methodologies and tools.

SAP ONB & OFB consultant

Hyderabad, Bengaluru

5 - 7 years

INR 18.0 - 20.0 Lacs P.A.

Work from Office

Full Time

We are seeking an experienced SAP ONB & OFB Consultant with a proven track record of at least two ECC implementations. The ideal candidate will be skilled in SAP Onboarding (ONB) and Offboarding (OFB) solutions, with a deep understanding of integrating these processes with SAP ECC.

Key Responsibilities:
- Implementation & Configuration: Design, configure, and implement SAP Onboarding (ONB) and Offboarding (OFB) solutions.
- Integrate ONB/OFB processes with SAP ECC systems to streamline employee lifecycle management.
- Manage the end-to-end implementation process, including planning, configuration, testing, and deployment.

Qualifications:
- Experience: Minimum of 5 years of experience as an SAP ONB & OFB Consultant, including at least two ECC implementations. Proven experience in configuring and supporting SAP Onboarding and Offboarding solutions.
- Technical Skills: Strong knowledge of SAP ECC integration with the ONB and OFB modules. Familiarity with SAP SuccessFactors is a plus. Proficiency in SAP configuration and data management.

Python Developer

Kochi, Kolkata, Bhubaneswar

3 - 5 years

INR 40.0 - 45.0 Lacs P.A.

Work from Office

Full Time

Responsibilities:
- Design, develop, and implement server-side applications using Python and related frameworks (e.g., Django, Flask).
- Write clean, well-documented, and maintainable code.
- Participate in all phases of the software development lifecycle, from requirements gathering to deployment and maintenance.
- Collaborate with cross-functional teams (e.g., front-end developers, DevOps engineers) to deliver integrated solutions.
- Troubleshoot and debug issues in existing applications.
- Stay up to date with the latest advancements in Python and server-side technologies.

Architect and Practice Head (AI and Gen AI)

Pune

8 - 13 years

INR 35.0 - 45.0 Lacs P.A.

Remote

Full Time

We are seeking a highly motivated, self-driven AI leader with hands-on machine learning implementation experience to head our AI and Gen AI practice. The successful candidate will be responsible for designing and implementing AI and Gen AI-based solutions to address customer business problems.

Key Responsibilities:
- AI Solution Consultant: Understand customer needs and propose AI and Gen AI solutions to address business problems.
- AI Practice Head: Research industry advancements in AI and Gen AI to identify practical, differentiating solutions for business problems. Build and inspire a strong team of Data Scientists to deliver complex AI and Gen AI programs.
- AI Solution Architect: Define architecture, design solutions, and implement AI and Gen AI-based solutions for customer programs. Conduct architecture reviews, design workshops, and code reviews in collaboration with AI Data Scientists and Cloud Engineers.
- Implementation and Optimization: Apply statistics, modeling, and machine learning to improve system efficiency and algorithm relevance. Build scalable machine learning solutions and ensure their practical use in production environments.

Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field; an MBA is a plus.
- 10+ years of experience in designing and implementing machine learning and deep learning models.
- Proficiency in R, Python (NumPy, scikit-learn, pandas), TensorFlow, PyTorch, MLlib, and LLM solutions.
- Strong problem-solving and analytical skills, with the ability to translate business requirements into data science solutions.
- Experience in scripting SQL for extracting large data sets and designing ETL flows.
- Deep interest in data, metrics, analysis, and trends, and a strong understanding of measurement, statistics, and program evaluation.
- Effective communication skills, with the ability to convey complex technical concepts to both technical and non-technical stakeholders.

Guidewire Developer

Hyderabad

5 - 10 years

INR 30.0 - 35.0 Lacs P.A.

Work from Office

Full Time

ROLE RESPONSIBILITIES
- Work with technical and business leaders during Guidewire PolicyCenter implementations, actively participating in requirements review, assessment, and high-level and detailed sizing of the required configuration and technical effort.
- Ensure technical design and specifications are in line with traceable user stories/requirements and follow Guidewire and Sompo development best practices.
- Leverage available Guidewire accelerators to speed up development activities and reuse proven architectural patterns for integration or product development activities.
- Ensure that delivered code is standards-based, GUnit-tested, and code-reviewed with supporting artifacts, in line with business requirements and technical specifications.
- Develop integration solutions including batch processes, message queues, and event messaging.
- Develop integration solutions with internal and external consumers.
- Leverage Guidewire EDGE APIs or EDGE-oriented extended models to build reusable, domain-based API solutions.
- Establish and execute traceable unit and integration tests (automated, as agreed).
- Conduct code/peer reviews and ensure that unit, integration, end-to-end, and smoke tests are run as part of continuous integration and deployment activities.
- Gate-check code prior to propagation to higher environments.
- Support various lifecycle phases during and post-implementation, including production support and maintenance.

PySpark Developer

Bengaluru

6 - 10 years

INR 30.0 - 35.0 Lacs P.A.

Work from Office

Full Time

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
- Optimize PySpark jobs for performance tuning, partitioning, and caching strategies.
- Design and implement real-time and batch data processing solutions.
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
- Ensure data security, governance, and compliance with industry best practices.
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
- Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering.
- Strong hands-on experience with PySpark (Apache Spark with Python).
- Expertise in SQL, the DataFrame API, and RDD transformations.
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
- Experience with workflow orchestration tools such as Airflow, Oozie, or Prefect.
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
- Excellent problem-solving, debugging, and performance optimization skills.

Domestic IT Recruiter

Hyderabad

0 - 5 years

INR 2.0 - 5.5 Lacs P.A.

Work from Office

Full Time

Are you a passionate individual looking to start or grow your career in IT recruitment? Join our dynamic team at Hanker Systems, where you'll get hands-on experience sourcing, screening, and engaging top tech talent.

What We're Looking For:
- Strong communication and interpersonal skills
- Passion for recruitment and talent acquisition
- Willingness to work in a fast-paced environment
- Freshers with a good attitude and an MBA (HR) background are welcome!

Hanker Systems (India) | Information Technology | Gurgaon | 51-200 Employees | 115 Jobs
