
Tacnique

13 Job openings at Tacnique
Senior Manual Test Engineer | Mumbai, Maharashtra, India | 0 years | Not disclosed | On-site | Full Time

About the Role:
We are seeking an experienced and detail-oriented Senior QA Engineer with expertise in manual testing to join our team in Mumbai. As a Senior Engineer, you will play a key role in ensuring the quality and reliability of our applications and systems. This is a fantastic opportunity to work in a collaborative environment and lead a talented team of QA professionals.

Key Responsibilities:
- Lead the end-to-end quality assurance process for software applications, ensuring all deliverables meet the highest quality standards.
- Develop, implement, and maintain detailed test plans, test cases, and test scripts.
- Conduct comprehensive manual testing, including functional, regression, integration, and system testing.
- Collaborate with developers, product managers, and other stakeholders to understand requirements and resolve issues.
- Identify, document, and track defects using appropriate tools and follow up on resolutions.
- Mentor and guide the QA team, ensuring best practices and standards are adhered to.
- Continuously improve testing processes to enhance efficiency and effectiveness.
- Stay updated on the latest trends and advancements in manual testing and QA methodologies.

Required Skills and Qualifications:
- 5+ years of experience in manual testing, with a proven track record of leading QA efforts on complex projects.
- Expertise in writing and executing test plans, test cases, and test scenarios.
- Strong understanding of the software development life cycle (SDLC) and software testing life cycle (STLC).
- Proficient with defect-tracking tools such as JIRA or Bugzilla, and with version control tools such as Git.
- Experience testing web and mobile applications.
- Experience in security testing to identify vulnerabilities, ensure compliance with standards, and safeguard systems from potential threats.
- Ability to track and lead the release process, while defining and monitoring performance metrics to measure product quality and success.
- An AI mindset: approaches work with curiosity about intelligent tools, awareness of their limitations, and a focus on using them responsibly for client value.
- Excellent analytical, problem-solving, and debugging skills.
- Strong communication and interpersonal skills, with a leadership mindset.
- Knowledge of automation testing tools is a plus.
- Familiarity with API testing tools such as Postman or Swagger (see the sketch after this listing).
- Experience in performance and load testing is an added advantage.
- Understanding of Agile development methodologies.
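Much of the API-level verification described above is typically run in Postman, but the same checks translate directly into small scripts. The sketch below is a minimal, hypothetical example of how a QA engineer might codify a repeatable API check; the base URL, endpoint, payload, and expected fields are placeholders, not an actual Tacnique or client API.

```python
# Minimal API smoke test; the host, endpoint, and expected fields are
# hypothetical placeholders used purely for illustration.
import requests

BASE_URL = "https://api.example.com"  # placeholder host

def test_create_and_fetch_user():
    # Create a resource and assert the response contract.
    payload = {"name": "Asha", "email": "asha@example.com"}
    created = requests.post(f"{BASE_URL}/users", json=payload, timeout=10)
    assert created.status_code == 201, created.text

    user_id = created.json()["id"]

    # Fetch it back and verify the fields round-trip correctly.
    fetched = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["email"] == payload["email"]

if __name__ == "__main__":
    test_create_and_fetch_user()
    print("API smoke test passed")
```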

Senior Java Software Engineer | Pune, Maharashtra, India | 4 years | Not disclosed | On-site | Full Time

About Ajackus:
Ajackus is a leading technology solutions provider that helps businesses achieve their goals through innovative software solutions. We are committed to delivering high-quality services and building long-term relationships with our clients. Our team of experts is dedicated to creating impactful and efficient solutions that drive success.

Job Description:
We are looking for a Senior Java Developer with 4+ years of experience who is skilled in Java, Spring Boot, PostgreSQL, NoSQL, and Maven. Experience with React is an added advantage. Join us to contribute to innovative projects and be part of a growth-driven team.

Key Responsibilities:
- Develop and maintain robust backend systems using Java and Spring Boot.
- Design and implement database structures, ensuring optimized performance for PostgreSQL and NoSQL databases.
- Utilize Maven for dependency management and build automation.
- Collaborate with the team to define and implement features that meet client requirements.
- Write clean, well-documented, and efficient code while adhering to coding standards.
- Troubleshoot, debug, and upgrade software solutions to ensure reliability.
- Optionally contribute to front-end development using React when required.

Qualifications:
- 4+ years of hands-on experience in software development using Java and Spring Boot.
- Proficient in working with PostgreSQL and NoSQL databases.
- Familiarity with build tools such as Maven.
- Experience with React for front-end development is good to have.
- Strong problem-solving skills and the ability to work independently or in a team.
- A Bachelor's degree in Computer Science, Engineering, or a related field.

Data Engineer – Databricks & Spark | India | 8 years | Not disclosed | Remote | Full Time

Senior Data Engineer – Big Data & Analytics
Location: Remote (CET time zone preferred)
Tech Stack: Databricks, SQL, Spark, Python, AWS (ECS, S3)
Desired Experience: 3 to 8 years

About the Project:
Join an innovative project focused on revolutionizing how companies manage and optimize their promotional activities. This data-driven solution enables businesses to analyze the effectiveness of promotions, plan more impactful campaigns, and integrate seamlessly with existing ecosystems. Leveraging machine learning and advanced analytics, the platform enhances vendor collaboration, improves deal negotiation, and maximizes return on investment (ROI).

Role:
We are seeking a highly skilled Senior Data Engineer with deep expertise in Databricks, SQL, and Apache Spark to design, build, and optimize scalable data pipelines. The ideal candidate thrives in a collaborative environment, enjoys problem-solving, and can work closely with stakeholders to clarify requirements and ensure high-quality data delivery. Strong experience in ETL processes, data modeling, and analytics engineering is desired, along with familiarity with dbt.

Responsibilities:
- Design, build, and optimize scalable data pipelines using Databricks, SQL, and Apache Spark (see the sketch after this listing for an illustration).
- Develop and maintain robust ETL processes for large-scale data systems.
- Collaborate with cross-functional teams to understand business needs and translate them into data solutions.
- Ensure high performance and quality of data workflows.
- Support machine learning and analytics teams with curated, reliable datasets.
- Implement best practices in data engineering, testing, and documentation.

Qualifications:
- Deep expertise in Databricks, Apache Spark, and SQL.
- 3–8 years of hands-on experience designing and maintaining scalable ETL pipelines, with a strong foundation in data engineering best practices and cloud platforms.
- Strong experience with data modeling and ETL pipeline development.
- Proficient in Python for data manipulation and scripting.
- Hands-on experience with AWS (especially S3 and ECS).
- Comfortable working in a Linux/Ubuntu environment.
- Familiarity with dbt (data build tool) for managing data transformations.
- Prior experience in promotional optimization or the retail analytics domain.
- Strong problem-solving skills, preferring logical and innovative solutions over legacy experience.
- Excellent communication skills, with the ability to clarify requirements directly with stakeholders.
- Willingness to engage in pair programming and collaborative problem-solving.
- Proactive, self-driven, and able to work independently in a distributed team environment.
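As an illustration of the kind of pipeline work this role involves, here is a minimal PySpark sketch that aggregates raw promotion transactions into per-promotion effectiveness metrics on a Databricks-style cluster. The bucket paths, column names, and metric definition are assumptions chosen for the example, not details of the actual platform.

```python
# Minimal PySpark ETL sketch: read raw sales events, aggregate per promotion,
# and write a curated table. Paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("promo-effectiveness").getOrCreate()

# Raw transactional data landed in S3 (hypothetical bucket/prefix).
sales = spark.read.parquet("s3://example-bucket/raw/sales/")

promo_metrics = (
    sales
    .filter(F.col("promo_id").isNotNull())
    .groupBy("promo_id", "vendor_id")
    .agg(
        F.sum("revenue").alias("total_revenue"),
        F.sum("discount_amount").alias("total_discount"),
        F.countDistinct("order_id").alias("orders"),
    )
    # A simple ROI-style ratio; real metric definitions would come from the business.
    .withColumn("revenue_per_discount",
                F.col("total_revenue") / F.col("total_discount"))
)

# Write a curated, partitioned output for downstream ML and analytics teams.
(promo_metrics
 .write.mode("overwrite")
 .partitionBy("vendor_id")
 .parquet("s3://example-bucket/curated/promo_metrics/"))
```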

Senior Data Engineer | India | 8 years | Not disclosed | Remote | Full Time

Same description as the Data Engineer – Databricks & Spark opening above: Senior Data Engineer – Big Data & Analytics, Remote (CET time zone preferred), 3 to 8 years of experience, tech stack Databricks, SQL, Spark, Python, and AWS (ECS, S3).

Senior Big Data Engineer | India | 8 years | Not disclosed | Remote | Full Time

Same description as the Data Engineer – Databricks & Spark opening above: Senior Data Engineer – Big Data & Analytics, Remote (CET time zone preferred), 3 to 8 years of experience, tech stack Databricks, SQL, Spark, Python, and AWS (ECS, S3).

Senior Workday Developer | Gurugram, Haryana, India | 0 years | Not disclosed | On-site | Full Time

About the Role:
We are seeking a skilled and experienced Workday Extend Developer to lead the end-to-end implementation of five business-critical applications using Workday Extend. This role plays a pivotal part in NAB's strategic initiative to build a scalable and sustainable foundation for future Extend applications. You will be responsible for setting up the development environment, designing and deploying applications, and transferring knowledge to internal teams. This opportunity offers the chance to work in a high-impact, Agile environment where you will collaborate with stakeholders and deliver innovative Workday solutions to support key HR and operational processes.

Responsibilities:
1. Environment Setup
- Provision and configure the Workday Extend development environment.
- Set up tenant security, permissions, and deployment frameworks.
- Establish DevOps processes, including build/test/deploy pipelines.
2. Application Development & Deployment
- Deliver five key Workday Extend applications:
  - UC1 – Work Schedule Management
  - UC2 – TAF File Number Onboarding & Validation
  - UC3 – Super Choice Forms & Validations
  - UC4 – Higher Duty Allowance for Acting Managers
  - UC5 – Position Attribute Management
- Engage in full-lifecycle development:
  - Validate requirements and align with stakeholders.
  - Design and build applications using Workday Extend Studio.
  - Conduct unit and integration testing.
  - Deploy to non-production and production tenants.
  - Provide documentation and conduct knowledge transfer to internal teams.
3. Agile Delivery
- Collaborate in Agile teams through sprint planning, reviews, retrospectives, and daily stand-ups.
- Ensure timely delivery of high-quality applications in line with project milestones.
4. Outcomes & Deliverables
- A fully operational Workday Extend development environment.
- Five tested and deployed Workday Extend applications.
- Knowledge-transfer documentation to enable long-term support and scalability by internal teams.

Qualifications:
- 3 to 9 years of hands-on experience working with Workday Extend.
- Proven experience with Workday Extend: design, development, and deployment.
- Strong understanding of the Workday Object Model and tenant security frameworks.
- Proficiency in XSLT, JSON, XPath, and REST APIs.
- Experience with Workday Studio and Workday Web Services (WWS).
- Demonstrated capability with Extend UI components and custom business logic.
- Proficient in JavaScript for client-side scripting within Extend applications.
- Basic knowledge of Java for Studio-based integrations.
- Familiarity with CI/CD tools (e.g., Git, Jenkins, or Workday-supported pipelines).
- Understanding of Workday environments: sandbox, preview, and production.
- Ability to work effectively within an Agile delivery framework.

Preferred Qualifications:
- Workday Extend Certification (if applicable).
- Prior experience in the financial or banking sector (nice to have).

Senior Software Engineer - Java | India | 4 years | Not disclosed | On-site | Full Time

Role:
We are looking for a Java Developer with 4+ years of experience who is skilled in Java, Spring Boot, PostgreSQL, NoSQL, and Maven. Experience with React is an added advantage. Join us to contribute to innovative projects and be part of a growth-driven team.

Responsibilities:
- Develop and maintain robust backend systems using Java and Spring Boot.
- Design and implement database structures, ensuring optimized performance for PostgreSQL and NoSQL databases.
- Utilize Maven for dependency management and build automation.
- Collaborate with the team to define and implement features that meet client requirements.
- Write clean, well-documented, and efficient code while adhering to coding standards.
- Troubleshoot, debug, and upgrade software solutions to ensure reliability.
- Optionally contribute to front-end development using React when required.

Qualifications:
- 4+ years of hands-on experience in software development using Java and Spring Boot.
- Proficient in working with PostgreSQL and NoSQL databases.
- Familiarity with build tools such as Maven.
- Experience with React for front-end development is good to have.
- Strong problem-solving skills and the ability to work independently or in a team.
- A Bachelor's degree in Computer Science, Engineering, or a related field.

AI & Machine Learning Engineer | Pune, Maharashtra, India | 4 years | Remote | Full Time

AI Engineer – Computer Vision, NLP & Deep Learning
Type, Location: Full Time @ Pune
Desired Experience: 4+ years

Job Description

Role:
- Develop, fine-tune, and deploy deep learning models for computer vision and NLP use cases using PyTorch or TensorFlow (a minimal fine-tuning sketch follows this listing).
- Design and maintain model pipelines, data loaders, and API layers in Python for scalable inference and integration.
- Implement frontend and backend integrations of AI models using TypeScript/JavaScript (e.g., Node.js, LangChain.js, Transformers.js).
- Work with datasets for annotation, augmentation, and visualization using tools like LabelImg, Albumentations, and FiftyOne.
- Build semantic search and recommendation systems using vector databases like Pinecone, Weaviate, or Milvus.
- Integrate NoSQL and graph-based storage systems like MongoDB and Neo4j for AI-related data operations.
- Collaborate with cross-functional teams to deliver production-ready, observable, and testable ML components.
- Contribute to infrastructure and CI/CD for AI model deployment and versioning.
- Document architectures, APIs, model behavior, and performance-tuning guidelines.
- Participate in sprint planning, reviews, and architecture discussions within a remote-first engineering team.

Qualifications:
- 5+ years of experience in AI/ML engineering, with deep expertise in computer vision, NLP, and model lifecycle management.
- Advanced proficiency in Python for deep learning, data pipelines, and API development.
- Hands-on experience with PyTorch or TensorFlow, with strong skills in training, fine-tuning, and optimizing DL models.
- Experience in TypeScript/JavaScript for integrating AI models into web applications using Node.js or frontend frameworks.
- Familiarity with C++ for performance-critical or system-level tasks is a plus.
- Experience working with vision tools (OpenCV, Detectron2/MMDetection) and managing large annotated datasets.
- Strong understanding of NoSQL (MongoDB, Redis), vector databases (Pinecone, Milvus), and graph DBs (Neo4j).
- Proficiency in building scalable ML services, with observability (logging, metrics, alerting) and test coverage.
- Exposure to orchestration tools (Airflow, Prefect) and cloud deployment workflows (Azure preferred).
- Comfortable working in remote, agile teams, with strong communication and problem-solving skills.
- Ability to work independently and drive AI projects from experimentation to production.
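The role centers on fine-tuning vision and NLP models in PyTorch or TensorFlow. As a rough illustration of that workflow, the sketch below fine-tunes only the classification head of a pretrained ResNet-18 on a local image-folder dataset; the dataset path, epoch count, and hyperparameters are placeholder assumptions, not project specifics.

```python
# Minimal transfer-learning sketch in PyTorch: fine-tune only the classifier
# head of a pretrained ResNet on an image-folder dataset. Paths are placeholders.
import torch
from torch import nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=transform)  # placeholder path
train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():          # freeze the pretrained backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # new head
model = model.to(device)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):                    # short illustrative training loop
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")

torch.save(model.state_dict(), "cv_head_finetuned.pt")
```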

Senior Software Engineer - DevOps | Hyderabad, Telangana, India | 4 years | Not disclosed | On-site | Full Time

About the Role:
We are seeking a highly skilled DevOps Engineer to join our dynamic team. In this role, you will play a key part in designing, implementing, and maintaining our infrastructure and CI/CD pipelines. Your expertise in Kubernetes, Helm, Docker, AWS, Terraform, and Git will be critical to ensuring our systems are scalable, reliable, and efficient. Additionally, familiarity with GitLab CI/CD, Datadog/observability, monitoring, and Python scripting is highly desirable.

Key Responsibilities:
- Infrastructure as Code (IaC): Design and manage infrastructure using Terraform to automate deployment processes.
- Container Orchestration: Deploy, maintain, and optimize applications using Docker, Kubernetes, and Helm.
- Cloud Management: Architect and manage cloud infrastructure in AWS, ensuring high availability and scalability.
- Version Control: Manage and optimize workflows using Git for source code and infrastructure repositories.
- Pipeline Automation: Develop and maintain CI/CD pipelines, leveraging tools like GitLab CI/CD where applicable.
- Monitoring & Troubleshooting: Implement robust monitoring systems and troubleshoot infrastructure or application issues.
- Collaboration: Partner with development, QA, and security teams to ensure seamless delivery and system stability.
- Documentation: Create and maintain comprehensive documentation for processes, configurations, and standards.

Required Skills:
- Strong proficiency in Kubernetes, Docker, and container orchestration tools.
- Strong experience with Helm for package management in Kubernetes.
- Advanced knowledge of AWS (e.g., EKS, VPC, RDS, IAM, ECS, ECR, EC2, S3, CloudFormation, and others).
- Expertise in Terraform, including authoring IaC and modules, and managing and debugging deployment states.
- Solid understanding of and hands-on experience with Git version control.

Nice-to-Have Skills:
- Experience with GitLab CI/CD or other CI/CD tools.
- Proficiency in Python scripting for automation and tooling (see the sketch after this listing for an example).
- Expertise in AWS cost optimization.
- Prior experience with observability, monitoring, and alerting, preferably in Datadog or a similar platform.
- Strong proficiency in Linux.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 4+ years of professional experience in DevOps or related fields.
- Strong problem-solving skills with attention to detail.
- Excellent communication and teamwork abilities.

Why Join Us?
- Opportunity to work with cutting-edge technologies.
- Opportunity to join a dynamic, growing organization.
- Collaborative and innovative work environment.
- Competitive salary and benefits.
- Professional development opportunities.
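Python scripting is listed as a nice-to-have for automation and tooling. As one hedged example of that kind of tooling, the sketch below uses boto3 to list running EC2 instances missing an "owner" tag, a typical cost/compliance hygiene check; the region name and required tag key are assumptions for the example, not team standards.

```python
# Small automation sketch with boto3: find running EC2 instances that are
# missing an "owner" tag. Region and tag key are illustrative assumptions.
import boto3

REQUIRED_TAG = "owner"

def untagged_running_instances(region="ap-south-1"):
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instances")
    missing = []
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(instance["InstanceId"])
    return missing

if __name__ == "__main__":
    for instance_id in untagged_running_instances():
        print(f"missing '{REQUIRED_TAG}' tag: {instance_id}")
```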

Senior Data Engineer | Ahmedabad, Gujarat, India | 0 years | Not disclosed | On-site | Full Time

Role:
As a Senior Data Engineer, you will be instrumental in designing, building, and maintaining scalable data infrastructure using modern big data technologies. You will work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure high-quality data availability and usability. This role requires strong technical expertise, a collaborative mindset, and a problem-solving attitude.

Responsibilities:
- Build and optimize data pipelines using Databricks, SQL, and Apache Spark.
- Design and implement scalable and reliable data processing systems.
- Develop and maintain robust ETL/ELT workflows for large-scale data sets.
- Manage and monitor data pipelines to ensure performance and efficiency.
- Ensure high standards of data quality, consistency, and integrity (a simple data-quality check sketch follows this listing).
- Collaborate with team members to solve complex technical challenges, including pair programming.
- Participate in discussions to clarify requirements with business and technical stakeholders.
- Contribute to process and infrastructure improvements through automation and innovation.

Qualifications:
- Proven expertise in Databricks, SQL, and Apache Spark.
- Strong SQL skills for data querying, transformation, and optimization.
- Experience in data modeling and building ETL/ELT processes.
- Proficiency in Python for data scripting and workflow orchestration.
- Hands-on experience with AWS (especially S3 and ECS) and cloud-native data engineering.
- Familiarity with Linux/Ubuntu operating systems.
- Experience with analytics engineering principles.
- Exposure to dbt (Data Build Tool) for transformation management.
- Understanding of machine learning-driven data workflows is a plus.
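To illustrate the data-quality responsibility in concrete terms, here is a minimal PySpark sketch that validates a curated table before it is published downstream. The table path, column names, and business rules are assumptions for the example only.

```python
# Data-quality sketch in PySpark: validate a curated table before publishing it.
# Table path, column names, and thresholds are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
orders = spark.read.parquet("s3://example-bucket/curated/orders/")  # placeholder

total = orders.count()

checks = {
    # Primary key must be present and unique.
    "null_order_id": orders.filter(F.col("order_id").isNull()).count(),
    "duplicate_order_id": total - orders.select("order_id").distinct().count(),
    # Business rule: order amounts should never be negative.
    "negative_amount": orders.filter(F.col("amount") < 0).count(),
}

failed = {name: bad for name, bad in checks.items() if bad > 0}
if failed:
    # Fail the pipeline run loudly instead of publishing bad data downstream.
    raise ValueError(f"Data-quality checks failed on {total} rows: {failed}")

print(f"All data-quality checks passed on {total} rows")
```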

Senior Data Engineer - ETL / Go / Databricks | Ahmedabad, Gujarat, India | 5 years | Not disclosed | On-site | Full Time

Role:
We are looking for a Senior Data Engineer to join a high-impact project focused on transforming how companies optimize and evaluate their promotional activities. In this role, you will be a key contributor to a cutting-edge platform that:
- Collects and validates large volumes of data,
- Analyzes promotion effectiveness using ML,
- Automates promotional calendar planning,
- Seamlessly integrates with enterprise systems,
- Enables better vendor collaboration and data-driven deal negotiations.
You will work with a modern data stack (Scala, Go, Databricks, Kubernetes, Docker) and play a critical role in designing and scaling ETL pipelines, backend services, and cloud-native infrastructure to support the platform's data science and analytics needs.

Responsibilities:
- Design, develop, and maintain scalable distributed systems using Go and/or Scala.
- Build robust ETL pipelines and data workflows using Databricks.
- Handle containerization and orchestration using Docker and Kubernetes.
- Manage data storage and retrieval through PostgreSQL and Elasticsearch (a small sync sketch follows this listing).
- Deploy, monitor, and optimize solutions in Microsoft Azure.
- Participate in code reviews, pair programming, and requirement clarification sessions with stakeholders.
- Ensure data quality, performance, and system scalability throughout the pipeline lifecycle.
- Collaborate with cross-functional team members to deliver end-to-end solutions.

Qualifications:
- 5+ years of professional experience in data engineering or backend development.
- Strong proficiency in Scala and/or Go.
- Hands-on experience with Databricks and building scalable ETL pipelines.
- Experience with Docker and Kubernetes for microservices orchestration.
- Proficiency in PostgreSQL and Elasticsearch.
- Proven experience deploying solutions in the Azure cloud.
- (Nice to have) Experience in Python.
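The core stack here is Scala and Go, but since Python is listed as a nice-to-have, here is a small Python sketch of the kind of storage-and-retrieval glue the role touches: syncing rows from PostgreSQL into an Elasticsearch index for search. The connection strings, table, columns, and index name are all placeholders, not details of the actual platform.

```python
# Sketch: copy deal records from PostgreSQL into an Elasticsearch index for
# search. Connection strings, table, and index names are placeholders.
import psycopg2
from elasticsearch import Elasticsearch, helpers

PG_DSN = "dbname=promos user=app password=secret host=localhost"  # placeholder
ES_URL = "http://localhost:9200"                                  # placeholder

def sync_deals(index="deals"):
    es = Elasticsearch(ES_URL)
    with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
        cur.execute("SELECT id, vendor, title, discount_pct FROM deals")
        actions = (
            {
                "_index": index,
                "_id": row[0],
                "_source": {"vendor": row[1], "title": row[2],
                            "discount_pct": float(row[3])},
            }
            for row in cur
        )
        # Bulk-index everything the query returned.
        indexed, _ = helpers.bulk(es, actions)
    return indexed

if __name__ == "__main__":
    print(f"indexed {sync_deals()} documents")
```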

Senior Software Engineer - Java | Pune, Maharashtra, India | 0 years | Not disclosed | On-site | Full Time

Role:
We are seeking a highly skilled Senior Backend Java Developer with 4 or more years of hands-on experience in designing, developing, and maintaining robust backend systems. The ideal candidate will have strong expertise in Java, Spring, SQL, and unit testing frameworks such as JUnit or Mockito. Familiarity with AI-assisted development tools (such as Claude Code, Cursor, or GitHub Copilot) is essential. Experience with DevOps practices and cloud platforms like AWS is a plus.

Responsibilities:
- Design, build, and maintain scalable and secure backend services using Java and the Spring framework.
- Develop and execute unit and integration tests using JUnit, Mockito, or equivalent frameworks.
- Collaborate with frontend engineers, DevOps, and cross-functional teams to deliver complete and reliable features.
- Write and optimize SQL queries and manage relational databases to ensure high-performance data operations.
- Leverage AI-assisted coding tools (e.g., Claude Code, Cursor, GitHub Copilot) to boost productivity and maintain code quality.
- Participate in code reviews, ensure adherence to best practices, and mentor junior developers as needed.
- Troubleshoot, diagnose, and resolve complex issues in production and staging environments.
- Contribute to technical documentation, architecture discussions, and Agile development processes (e.g., sprint planning, retrospectives).

Qualifications:
- Strong proficiency in Java and object-oriented programming concepts.
- Hands-on experience with Spring/Spring Boot for building RESTful APIs and backend services.
- Proficiency in testing frameworks such as JUnit, Mockito, or equivalent.
- Solid experience in writing and optimizing SQL queries for relational databases (e.g., PostgreSQL, MySQL).
- Experience using AI-assisted coding tools (e.g., Claude Code, Cursor, GitHub Copilot) in a production environment.
- Understanding of DevOps tools and practices (CI/CD, Docker, etc.).
- Experience with AWS services (e.g., EC2, RDS, S3, Lambda).
- Exposure to containerization and cloud-native development.

Quality Assurance Automation Engineer | Hyderabad, Telangana, India | 4 years | Not disclosed | On-site | Full Time

Type, Location, Time Zone: Hybrid @ Hyderabad, Telangana
Desired Experience: 4+ years

Job Description

Area:
Nabis is the #1 licensed cannabis wholesale platform in the world, with the largest portfolio of cannabis brands, supplying hundreds of brands to retailers across California, New York, and Nevada. Our mission is to empower the world to discover cannabis by providing choice, access, and innovation. We're at the forefront of this movement and are building an innovative, technology-first platform to scale the entirety of the cannabis industry. Through dedication to enhancing efficiency, transparency, and customer satisfaction, Nabis is paving the way for sweeping legalization.

Role:
As a Senior QA Automation Engineer, you will design, implement, and maintain automation frameworks and test scripts to ensure the quality and reliability of our software products. You will work closely with developers, product managers, and fellow QA engineers to identify test requirements, develop comprehensive test plans, and drive automation efforts across our platform. The ideal candidate possesses a strong programming background, enjoys solving complex problems with simple solutions, and is committed to driving best practices in quality assurance and automation.

Responsibilities:
- Maintain and enhance existing test automation frameworks.
- Design targeted testing strategies for features being developed and automate them (a minimal Playwright sketch follows this listing).
- Create comprehensive test plans, then execute and automate them.
- Collect and report quality metrics from test execution.
- Develop and execute end-to-end test cases for multi-tier web applications.
- Write detailed bug reports and verify fixes.
- Provide recommendations to improve product reliability, performance, and QA processes.
- Mentor and cross-train team members on testing approaches.
- Collaborate with product and engineering teams to align testing efforts with product goals.
- Perform additional duties as needed.

Qualifications:
- 4+ years of experience in software quality assurance.
- 3+ years of hands-on experience with Playwright.
- Experience with functional testing of multi-tier web applications.
- Strong knowledge of Jira for bug tracking.
- Skilled in developing and executing end-to-end test cases.
- Knowledge of SQL, JSON, API testing, and Postman.
- Experience working independently with minimal supervision.
- Strong communication and organization skills.
- Experience with performance testing tools.
- Experience in security testing (preferred).
- Solid understanding of large-scale systems testing.
- Passion for testing and solving complex problems with simple solutions.
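Since the role is built around Playwright, here is a minimal end-to-end sketch using Playwright's Python API. The URL and the commented-out selectors are placeholders rather than details of the actual application under test.

```python
# Minimal Playwright (Python) end-to-end sketch: load a page and assert on it.
# The URL and selectors are placeholders, not the real application under test.
from playwright.sync_api import sync_playwright, expect

def test_landing_page():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        page.goto("https://example.com")              # placeholder URL
        expect(page).to_have_title("Example Domain")  # basic page assertion

        # Illustrative interaction; real selectors would come from the app:
        # page.fill("#search", "brand name")
        # page.click("button[type=submit]")
        # expect(page.locator(".result-item").first).to_be_visible()

        browser.close()

if __name__ == "__main__":
    test_landing_page()
    print("Playwright smoke test passed")
```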