About the Role:
We are looking for a passionate, results-driven Recruiter who can manage end-to-end recruitment and take full ownership of open positions. The ideal candidate will be confident handling diverse technology roles, from networking to development, and will have a proven track record of closing positions efficiently.

Key Responsibilities:
- Manage the complete recruitment lifecycle: requirement gathering, sourcing, screening, interview coordination, feedback collection, and offer negotiation.
- Partner with hiring managers and stakeholders to understand business needs and deliver quality hires.
- Source candidates through various platforms including Naukri, LinkedIn, Indeed, internal databases, and referrals.
- Build and maintain a strong pipeline of qualified candidates across a variety of roles.
- Provide timely updates and ensure a smooth candidate experience.
- Maintain recruitment trackers and reports for internal reviews.

Key Skills & Experience:
- 6+ years of experience in end-to-end IT recruitment.
- Strong sourcing skills using job portals (Naukri, LinkedIn, etc.), social media, and other modern channels.
- Proven experience hiring for roles such as Network Engineer, Project Manager, Business Analyst, .NET Developer, Java Developer, Full Stack Developer, DevOps Engineer, and Data Engineer.
- Excellent communication and stakeholder management skills.
- Ability to multitask and prioritize in a fast-paced environment.
- Self-starter with a strong sense of ownership and accountability.
About the Role:
We are looking for a passionate and detail-oriented Software Tester with experience in both manual and automation testing. The ideal candidate will play a crucial role in ensuring the quality and reliability of our applications by designing and executing test cases, identifying bugs, and automating repetitive test scenarios.

Key Responsibilities:
- Understand requirements and prepare detailed, comprehensive, and well-structured test plans and test cases
- Perform manual testing on web/mobile applications, APIs, and backend systems
- Develop, maintain, and execute automated test scripts using tools like Selenium, TestNG, JUnit, Postman, etc.
- Collaborate closely with developers, product managers, and other QA team members to resolve issues and ensure product quality
- Perform regression, functional, integration, and system testing
- Log defects clearly and track them through resolution using tools like JIRA or Bugzilla
- Contribute to test automation frameworks and maintain reusable code
- Participate in sprint planning, daily stand-ups, and review meetings (Agile environment)

Required Skills:
- 5-6 years of experience in software testing (both manual and automation)
- Experience testing CMSs such as AEM, Drupal, SilverStripe, WordPress, or any other CMS
- Experience in payment gateway domain testing
- Proficiency in automation tools like Selenium WebDriver, Postman, or similar
- Experience with Java, Python, or JavaScript for automation scripting
- Solid understanding of SDLC, STLC, and various testing methodologies
- Good knowledge of API testing and tools like REST Assured and Postman
- Strong analytical, debugging, and documentation skills
- Excellent verbal and written communication skills

Preferred Qualifications:
- Experience with performance testing tools like JMeter
- Knowledge of BDD frameworks like Cucumber
- Exposure to cloud platforms (AWS, Azure) is a plus
- ISTQB Certification is an added advantage
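To give a flavour of the automation scripting the role calls for, here is a minimal, hedged sketch of a data-driven automated test in Python's standard unittest framework. The `validate_payment_amount` function is a hypothetical stand-in for application logic in the payment gateway domain, not a real API; the pattern of tabulated cases with `subTest` is the point.

```python
import unittest

def validate_payment_amount(amount):
    # Hypothetical validation rule for illustration only:
    # amounts must be positive with at most two decimal places.
    if amount <= 0:
        return False
    return round(amount, 2) == amount

class PaymentValidationTest(unittest.TestCase):
    """Data-driven test cases, as they might appear in an automation suite."""

    def test_amount_validation(self):
        cases = [
            (10.00, True),    # typical charge
            (0.01, True),     # smallest unit
            (-5.00, False),   # negative amounts rejected
            (0, False),       # zero rejected
            (9.999, False),   # sub-cent precision rejected
        ]
        for amount, expected in cases:
            with self.subTest(amount=amount):
                self.assertEqual(validate_payment_amount(amount), expected)
```

Run with `python -m unittest` to execute all cases; `subTest` reports each failing amount individually instead of stopping at the first failure.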
CACI International Inc is an American multinational professional services and information technology company headquartered in Northern Virginia. CACI provides expertise and technology to enterprise and mission customers in support of national security missions and government transformation for defense, intelligence, and civilian customers. CACI has approximately 23,000 employees worldwide.

Headquartered in London, CACI Ltd is a wholly owned subsidiary of CACI International Inc, a publicly listed company on the NYSE with annual revenue in excess of US $6.2bn. Founded in 2022, CACI India is an exciting, growing and progressive business unit of CACI Ltd. CACI Ltd currently has over 2,000 intelligent professionals and is now adding many more from our Hyderabad and Pune offices. Through a rigorous emphasis on quality, CACI India has grown considerably to become one of the UK's most well-respected technology centres.

About The Data Platform
The Data Platform supports and enables a Data Mesh organisation. It uses AWS technology to provide a unified experience across AWS services, delivering an open federated Lakehouse and a unified user experience. The Data Platform focusses on enabling decentralised management, processing, analysis and delivery of data, while enforcing corporate-wide federated governance on data and project environments across business domains. The goal is to empower multiple teams to create and manage high-integrity data and data products that are analytics- and AI-ready and consumed internally and externally.

What does a Senior Data Engineer do?
A Senior Data Engineer partners closely with business units to maintain existing cloud data architectures, as well as to assess, design and execute the migration of their existing data products and workflows onto a modern cloud data platform. This involves understanding current data architectures, dependencies and transformation logic, and translating these into cloud-native solutions in harmony with the Company's data platform, governance and strategy.

The Senior Data Engineer will need skills in developing data pipelines that operate across a medallion architecture, with a strong focus on the data quality and integrity of the products and processes, always weighing the cost-benefit of different approaches. This requires both technical rigour and pragmatic delivery. As a platform team, our objective is to enable the business to operate their data architectures themselves, albeit with support from the Platform Team. In this role, the Senior Data Engineer will collaborate closely with business units, be generous with knowledge, and aim to hand over solutions for continued management by data analysts and citizen data engineers.

You will leverage skills in AWS services such as Glue, EMR, S3, MWAA (Apache Airflow), Step Functions, Redshift and SageMaker Unified Studio, as well as an understanding of traditional RDBMSs such as Postgres, Oracle and SQL Server. SQL, Python and PySpark will be essential in this role, as will experience with IaC such as CloudFormation, both to deploy solutions and to contribute to the build and maintenance of the platform itself. You will be able to design architectures and create reusable solutions that reflect the business needs. It is important that we create patterns we can reuse across multiple products and services, so that we are not designing new solutions for every need.
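As an illustration of the reusable-pattern idea above, a parameterised CloudFormation template can capture a standard data-product environment once and be stamped out per product. This is a minimal sketch only; the resource names and naming conventions here are hypothetical placeholders, not CACI standards.

```yaml
# Hypothetical reusable pattern: one parameterised template,
# deployed once per data product rather than a bespoke stack each time.
AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  ProductName:
    Type: String
    Description: Short name of the data product (placeholder convention)
Resources:
  ProductBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "data-product-${ProductName}"
      VersioningConfiguration:
        Status: Enabled
  ProductGlueDatabase:
    Type: AWS::Glue::Database
    Properties:
      CatalogId: !Ref AWS::AccountId
      DatabaseInput:
        Name: !Sub "product_${ProductName}"
```

Deploying the same template with different `ProductName` values gives every product a consistent bucket and Glue database, which is exactly the kind of pattern reuse the role describes.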
Some products will have linear data ingest, transformation and publishing requirements; others will require extensive analytical work in the pipelines, including machine learning, deep learning and the potential to integrate more unstructured data as new products are built and matured.

Responsibilities will include:
- Collaborating across CACI departments to develop and maintain data products and the data platform
- Designing and implementing data processing environments and integrations using AWS PaaS such as Glue, S3, Lambda, Fargate, EMR, SageMaker, Redshift, Aurora and Snowflake
- Data architecture and data modelling across full data lifecycles, as well as more detailed modelling of databases and data products
- Building data processing and analytics pipelines as code, using Python, SQL, PySpark, Spark, CloudFormation, Lambda, Step Functions and Apache Airflow
- Monitoring and reporting on data platform and data product performance, usage and security
- Designing and applying security and access control architectures to secure sensitive data
- Enabling business units by working with them to deliver complete and manageable solutions, while providing support and expert advice

You will have:
- Strong experience and knowledge of data architectures implemented in AWS using native services such as S3, DataZone, Glue, EMR, SageMaker, Aurora and Redshift, using Python, PySpark and SQL
- Experience developing and administering databases, data platforms and solutions
- Good coding discipline in terms of style, structure, versioning, documentation and unit tests
- A well-developed understanding of a data mesh organisation, as well as Master and Reference Data Management
- Experience migrating legacy to modern, as well as modern to modern
- Knowledge and experience of relational databases such as Postgres and Redshift
- Experience using Git for code versioning and lifecycle management
- Experience operating to Agile principles and ceremonies
- Hands-on experience with CI/CD tools such as GitLab
- Strong problem-solving skills and the ability to work independently or in a team environment
- Excellent communication and collaboration skills
- A keen eye for detail, and a passion for accuracy and correctness in numbers

Whilst not essential, the following skills would also be useful:
- Experience using Jira, or other agile project management and issue tracking software
- Experience with spatial data processing, technology and approaches
- Experience with machine learning and AI workflows
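To illustrate the medallion-style pipelines the role centres on, here is a minimal sketch of the bronze-to-silver-to-gold flow. On the platform described above this would typically be PySpark running on Glue or EMR; plain Python is used here only to keep the example self-contained, and the sample records are invented for illustration.

```python
# Minimal sketch of a medallion-style pipeline (bronze -> silver -> gold),
# in plain Python as a stand-in for PySpark on Glue/EMR.

RAW_EVENTS = [  # bronze layer: data as ingested, including bad records
    {"order_id": "A1", "amount": "120.50", "region": "UK"},
    {"order_id": "A2", "amount": "oops", "region": "UK"},   # malformed amount
    {"order_id": "A3", "amount": "75.00", "region": "IN"},
]

def to_silver(rows):
    """Cleanse and conform: drop records that fail basic quality checks."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would quarantine and log the record
    return silver

def to_gold(rows):
    """Aggregate into an analytics-ready data product: revenue per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(RAW_EVENTS))
# gold == {"UK": 120.5, "IN": 75.0}
```

Each layer is a pure function of the previous one, which is what makes such pipelines testable, composable as code, and easy to hand over to the citizen data engineers the role supports.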