Job Summary:
We’re looking for a seasoned Automation Engineer with 7+ years of experience in automation, scripting, and cloud-based systems. The ideal candidate will have strong proficiency in Python, AWS, and CI/CD automation, with exposure to Spring Boot applications.

Responsibilities:
- Develop and maintain automation scripts using Python and PyTest for deployment, testing, and monitoring (see the sketch after this posting).
- Automate infrastructure and application deployments on AWS using Terraform or CloudFormation.
- Build and manage CI/CD pipelines using GitLab CI.
- Collaborate with developers to automate testing and deployment of Spring Boot microservices.
- Monitor and troubleshoot cloud environments (CloudWatch, CloudTrail).

Requirements:
- 7+ years in automation, DevOps, or cloud engineering.
- Strong Python scripting and experience with PyTest.
- Hands-on experience with AWS (EC2, Lambda, S3, IAM, etc.).
- Proficiency with GitLab, Git, and CI/CD pipelines.
- Experience with Spring Boot applications in deployment/testing contexts.
- Familiarity with Docker, Kubernetes (EKS), and Infrastructure as Code (Terraform/CloudFormation).

Preferred:
- AWS certification (DevOps/Solutions Architect).
- Experience in Agile/Scrum environments.
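For illustration, a minimal post-deployment smoke test of the kind this role would automate with Python and PyTest. This is a hedged sketch, not part of the posting: the service URL, the /actuator/health path, and the smoke marker are assumptions.

```python
# smoke_test.py -- hypothetical post-deployment check run from a CI/CD pipeline.
import os

import pytest
import requests

# Placeholder: the deployed Spring Boot service URL would be injected by the pipeline.
SERVICE_URL = os.environ.get("SERVICE_URL", "http://localhost:8080")


@pytest.mark.smoke
def test_health_endpoint_reports_up():
    """Fail the pipeline if the freshly deployed service is not healthy."""
    resp = requests.get(f"{SERVICE_URL}/actuator/health", timeout=10)
    assert resp.status_code == 200
    assert resp.json().get("status") == "UP"
```

A job running `pytest -m smoke` in the GitLab pipeline could then gate promotion to the next environment.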
About the Role:
We are looking for a Senior Automation Engineer with strong experience in Python, Pytest, TestNG, GitLab CI/CD, and AWS. You will be responsible for building robust test frameworks, integrating automation into CI/CD pipelines, and ensuring high product quality in a cloud-native environment.

Responsibilities:
- Develop and maintain automated test suites using Python, Pytest, TestNG
- Integrate automation with GitLab pipelines for CI/CD
- Test backend APIs and cloud-based services on AWS (see the sketch after this posting)
- Collaborate with QA, DevOps, and development teams
- Troubleshoot test failures and ensure high test coverage

Required Skills:
- 7+ years in test automation
- Proficient in Python, Pytest, TestNG
- Experience with GitLab CI/CD
- Good knowledge of AWS services (EC2, S3, Lambda, etc.)
- Strong API testing and debugging skills

Nice to Have:
- Docker/Kubernetes familiarity
- Performance testing tools (JMeter, Locust)
- AWS certification
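As one purely illustrative example of testing a cloud-based service on AWS: a Pytest case that invokes a Lambda function through boto3 and asserts on the response. The function name, region, and payload shape are hypothetical, and the test assumes AWS credentials are available in the environment.

```python
import json

import boto3
import pytest

FUNCTION_NAME = "orders-api-handler"  # hypothetical function under test


@pytest.fixture(scope="module")
def lambda_client():
    # Assumes AWS credentials and permissions are configured for the test runner.
    return boto3.client("lambda", region_name="us-east-1")


def test_orders_lambda_returns_ok(lambda_client):
    resp = lambda_client.invoke(
        FunctionName=FUNCTION_NAME,
        Payload=json.dumps({"httpMethod": "GET", "path": "/orders/health"}),
    )
    # 200 means the invocation succeeded; the body carries the application-level status.
    assert resp["StatusCode"] == 200
    body = json.loads(resp["Payload"].read())
    assert body.get("statusCode") == 200
```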
Role: AWS Data Engineer
Experience: 12+ years

JD for AWS (DE):
- Experience in EMR
- Knowledge of the Python programming language
- Knowledge of data processing using the Pandas library (see the sketch after this posting)
- Extensive knowledge of processing CSV, Excel, JSON, and YAML files using Python
- Good knowledge of big data technologies: PySpark, Hadoop, Hive
- Knowledge of AWS services: S3, Lambda, Redshift, Glue
- Hands-on experience with Apache Airflow for building workflows
- Knowledge of building ETL and data pipelines
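A small, self-contained illustration of the pandas-based file processing this role calls for; the file names, column names, and JSON mapping file are hypothetical, not taken from the posting.

```python
import json

import pandas as pd


def load_trades(csv_path: str, mapping_path: str) -> pd.DataFrame:
    """Read a CSV, apply a JSON-driven column rename, and normalize types."""
    with open(mapping_path) as fh:
        column_map = json.load(fh)  # e.g. {"trade_dt": "trade_date"}
    df = pd.read_csv(csv_path)
    df = df.rename(columns=column_map)
    df["trade_date"] = pd.to_datetime(df["trade_date"])
    return df.dropna(subset=["trade_date"])


if __name__ == "__main__":
    print(load_trades("trades.csv", "column_map.json").head())
```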
Snowflake Developer (Overall 7-9 years)

- 3–4 years of hands-on experience in Snowflake data warehouse development, performance tuning, and database process optimization (see the sketch after this posting).
- Snowflake certified, with a solid understanding of security, roles, warehouses, and best practices. Certification is required.
- Strong knowledge of ELT/ETL tools with practical exposure to Alteryx for data preparation and transformation.
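By way of illustration only, a minimal sketch of driving a Snowflake transformation from Python with the snowflake-connector-python package; the account, credentials, warehouse, and table names are placeholders, not part of the posting.

```python
import snowflake.connector

# Placeholder connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    role="TRANSFORMER",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        CREATE OR REPLACE TABLE DAILY_SALES AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM RAW.ORDERS
        GROUP BY order_date
        """
    )
    # The query id is useful when reviewing Query History for performance tuning.
    print("query id:", cur.sfqid)
finally:
    conn.close()
```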
- Core data engineering tools and frameworks such as Azure, Snowflake, DBT, Python, SQL, and Git
- Solid data modelling and data management fundamentals to drive the ETL development lifecycle at all layers (conceptual, logical, physical)
- Git versioning practices for CI/CD and DevOps alignment
- The ability to operate independently, take ownership, and provide mentorship within the team
Salesforce Marketing Cloud and Pardot Architect

We are seeking a highly skilled Salesforce Marketing Cloud and Pardot Architect to lead the migration from Pardot (Marketing Cloud Account Engagement) to Salesforce Marketing Cloud (SFMC). The ideal candidate will have deep expertise in Marketing Cloud, Pardot, data architecture, integration strategies, and automation. This role requires a strategic thinker who can design scalable solutions while ensuring seamless data migration, process optimization, and business continuity.

Key Responsibilities:
- Lead the end-to-end migration from Pardot to Salesforce Marketing Cloud, including data migration, campaign transition, and automation redesign.
- Assess current Pardot configurations and identify gaps, dependencies, and improvements for the new SFMC architecture.
- Define data models and mapping strategies for customer journeys, segmentation, and engagement tracking.
- Develop and optimize email, SMS, and push notification campaigns using SFMC Journey Builder, Automation Studio, and Contact Builder.
- Ensure seamless integration between Salesforce CRM and SFMC, leveraging APIs, Marketing Cloud Connect, and Data Cloud where applicable.
- Design scalable automation workflows to enhance marketing efficiency and improve customer experience.
- Implement robust data governance, compliance, and security best practices in accordance with GDPR, CCPA, and other relevant regulations.
- Work closely with stakeholders (marketing, sales, IT) to ensure business requirements are met and adoption is smooth.
- Train internal teams and provide documentation on SFMC best practices, campaign execution, and reporting.
- Monitor and optimize marketing performance using Datorama, Analytics Builder, and custom dashboards.

Required Skills & Qualifications:
- 9+ years of experience in Marketing Automation, CRM, and the Salesforce ecosystem.
- Proven expertise in Salesforce Marketing Cloud (SFMC) and Pardot (Marketing Cloud Account Engagement).
- Strong experience in migration projects, particularly moving from Pardot to SFMC.
- Hands-on proficiency in SFMC modules: Email Studio, Journey Builder, Automation Studio, Contact Builder, Mobile Studio, and Advertising Studio.
- Expertise in API integrations, SQL, SSJS, and AMPscript for dynamic content and automation.
- Experience with Marketing Cloud Connect, Data Cloud, and Salesforce CRM integration.
- Solid knowledge of customer segmentation, personalization, and lead nurturing strategies.
- Strong analytical skills with experience in campaign performance tracking and optimization.
- Salesforce certifications such as Marketing Cloud Consultant, Pardot Specialist, or Data Cloud Certification are a plus.

Preferred Qualifications:
- Experience in multi-cloud solutions, integrating SFMC with Service Cloud, Commerce Cloud, or Data Cloud.
- Familiarity with third-party integrations, CDPs, or advanced data modelling.
- Prior experience in B2B and B2C marketing strategies using Salesforce platforms.
Job Title: ServiceNow Architect

We are seeking an experienced and strategic ServiceNow Architect to lead the design and governance of our ServiceNow platform, with a primary focus on IT Service Management (ITSM) and Hardware/Software Asset Management (HAM/SAM). The ideal candidate will have a strong background in financial services, with deep knowledge of regulatory, compliance, and operational requirements unique to the industry. As a ServiceNow Architect, you will be responsible for translating business objectives into scalable, compliant, and efficient ServiceNow solutions across the ITSM and Asset Management domains, driving platform strategy, design standards, and delivery excellence.

Key Responsibilities:
- Own the architecture and solution design for ServiceNow ITSM, HAM, and SAM modules, ensuring alignment with business goals and financial industry standards.
- Define and enforce platform governance, architecture principles, and integration patterns.
- Provide architectural oversight for all ServiceNow development, ensuring scalable and maintainable design.
- Partner with cross-functional teams to identify opportunities for automation, optimization, and improved service delivery.
- Lead the platform roadmap and upgrade planning, ensuring minimal business impact and high platform stability.
- Collaborate with security and compliance teams to ensure alignment with data protection, audit, and regulatory requirements (e.g., SOX, PCI, GDPR).
- Evaluate and design integrations with third-party systems, CMDB sources, discovery tools, and financial asset tracking systems.
- Act as the subject matter expert (SME) for ServiceNow architecture, mentoring development teams and reviewing code/configuration.
- Monitor new releases from ServiceNow and recommend relevant new features or modules to stakeholders.
- Lead architectural reviews, POCs, and performance assessments to ensure high availability and optimized system design.

Required Skills and Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 8+ years of ServiceNow experience, including 3+ years in a ServiceNow architect or lead developer role.
- Deep expertise in ITSM, HAM, and SAM module design, configuration, and deployment.
- Strong industry experience in financial services, with an understanding of operational risk, compliance, and governance.
- Hands-on experience with ServiceNow scripting (JavaScript, Glide API), Flow Designer, IntegrationHub, and REST/SOAP APIs.
- Extensive knowledge of CMDB design, discovery processes, and asset lifecycle governance.
- Experience with enterprise architecture frameworks (e.g., TOGAF, ITIL v4).
- Strong interpersonal and stakeholder management skills, including executive-level communication.
- Solid understanding of Agile, DevOps, CI/CD pipelines, and platform performance optimization.

Preferred Certifications:
- ServiceNow Certified System Administrator (CSA) – Required
- ServiceNow Certified Implementation Specialist in ITSM and HAM/SAM – Highly Preferred
- Certified Application Developer (CAD)
- TOGAF, ITIL v4, or similar enterprise architecture/IT service management certifications
Lead Data Engineer (Remote, India only)

Required Skills:
- Strong hands-on expertise in SQL, DBT, and Python for data processing and transformation.
- Expertise in Azure data services (e.g., Azure Data Factory, Synapse, Event Hub) and orchestration tools.
- Strong experience with Snowflake, including schema design, performance tuning, and the security model.
- Good understanding of DBT for the transformation layer and modular pipeline design.
- Hands-on with Git and version control practices: branching, pull requests, code reviews.
- Understanding of DevOps/DataOps principles: CI/CD for data pipelines, testing, monitoring.
- Knowledge of data modeling techniques: star schema, Data Vault, normalization/denormalization.
- Experience with real-time data processing architectures is a strong plus.
- Proven leadership experience: able to mentor team members, take ownership, and make design decisions independently.
- Strong sense of ownership, accountability, and a solution-oriented mindset.
- Ability to handle ambiguity and work independently with minimal supervision.
- Clear and confident communication (written and verbal); must be able to represent design and architecture decisions.

Responsibilities:
- Lead the design and development of data pipelines (batch and real-time) using modern cloud-native technologies (Azure, Snowflake, DBT, Python).
- Translate business and data requirements into scalable data integration designs.
- Guide and review development work across data engineering team members (onshore and offshore).
- Define and enforce best practices for coding, testing, version control, CI/CD, data quality, and pipeline monitoring.
- Collaborate with data analysts, architects, and business stakeholders to ensure data solutions are aligned with business goals.
- Own and drive end-to-end data engineering workstreams, from design to production deployment and support.
- Provide architectural and technical guidance on platform setup, performance tuning, cost optimization, and data security.
- Drive data engineering standards and reusable patterns across projects to ensure scalability, maintainability, and reusability of code and data assets.
- Define and oversee data quality frameworks to proactively detect, report, and resolve data issues across ingestion, transformation, and consumption layers (see the sketch after this posting).
- Act as the technical go-to team member for complex design, performance, or integration issues across multiple teams and tools (e.g., DBT + Snowflake + Azure pipelines).
- Contribute hands-on development for the end-to-end integration pipelines and workflows.
- Document using Excel, Word, or tools like Confluence.
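A hedged sketch of the kind of lightweight data-quality check such a framework might run after each pipeline load; the key columns and the sample data are illustrative only, not taken from the posting.

```python
import pandas as pd


def check_primary_key(df: pd.DataFrame, key_cols: list[str]) -> list[str]:
    """Return human-readable data-quality failures for a load (empty list = pass)."""
    failures = []
    dupes = int(df.duplicated(subset=key_cols).sum())
    if dupes:
        failures.append(f"{dupes} duplicate rows on key {key_cols}")
    for col, nulls in df[key_cols].isna().sum().items():
        if nulls:
            failures.append(f"{nulls} null values in key column '{col}'")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 1, None], "amount": [10.0, 10.0, 5.0]})
    print(check_primary_key(sample, ["order_id"]))  # reports a duplicate and a null
```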
Job Title: Senior QA Automation Engineer (Selenium + Java)
Location: Remote
Experience: 8+ Years | Type: Full-time

Job Overview:
We are looking for an experienced QA Automation Engineer with strong skills in Selenium and Java to design, build, and maintain test automation frameworks and drive quality in our agile development process.

Key Responsibilities:
- Develop and maintain automated test scripts using Selenium WebDriver + Java
- Create test frameworks using TestNG/JUnit, Maven/Gradle
- Collaborate with developers, QA, and product teams
- Perform functional, regression, UI, and API testing
- Integrate automation into CI/CD pipelines (e.g., Jenkins, GitHub Actions)

Requirements:
- 8+ years in QA Automation with strong Java + Selenium experience
- Hands-on with TestNG, Maven/Gradle, Git, REST API testing
- Familiar with CI/CD tools; BDD (e.g., Cucumber) a plus
- Good communication and leadership skills

Nice to Have:
- Experience with JMeter, Docker, cloud testing tools (e.g., BrowserStack)
- ISTQB certification
Job Title: Data Engineer – AWS & Financial Data Reconciliation
Location: Ahmedabad (Onsite)
Experience: 3+ Years
Job Type: Full-Time

About the Role:
We are looking for a skilled Data Engineer with experience in AWS cloud platforms and strong knowledge of financial data reconciliation processes. This is a key onsite role in our Ahmedabad office, responsible for building scalable data solutions and ensuring the integrity and accuracy of financial data across multiple systems.

Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines using AWS services (e.g., AWS Glue, Lambda, Redshift, S3, Athena).
- Reconcile large datasets from multiple financial systems, ensuring data accuracy, completeness, and auditability (see the sketch after this posting).
- Automate data validation and reconciliation processes to support finance, accounting, and compliance teams.
- Collaborate with cross-functional teams (finance, data analytics, business intelligence) to gather requirements and implement data solutions.
- Create and maintain data dictionaries, lineage documentation, and audit trails for financial datasets.
- Monitor pipeline performance and troubleshoot data quality or processing issues.
- Implement data governance best practices and support regulatory and internal compliance needs.

Required Skills & Qualifications:
- Minimum 3 years of experience in data engineering or similar roles.
- Proven experience working with financial data and data reconciliation (e.g., transactional data, ledger entries, settlements, P&L, balance sheets).
- Strong experience with AWS services: S3, Glue, Lambda, Redshift, Athena, CloudWatch, Step Functions.
- Strong in SQL and scripting languages such as Python or Scala.
- Experience building and automating data validation and reconciliation tools or processes.
- Familiarity with data warehousing concepts and data lake architecture.
- Experience working with version control systems like Git and CI/CD pipelines.
- Bachelor’s degree in Computer Science, Information Systems, Finance, or a related field.

Nice to Have:
- Understanding of accounting principles and finance domain KPIs.
- Experience with data visualization and BI tools (e.g., Power BI, Tableau, AWS QuickSight).
- Knowledge of data quality frameworks, audits, and controls.
- Exposure to real-time data streaming platforms (e.g., Kafka, Kinesis).
- Experience with infrastructure-as-code tools (Terraform, CloudFormation).
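To make the reconciliation responsibility concrete, here is a minimal, hedged pandas sketch that compares two hypothetical exports sharing a transaction id and an amount column; the file names, column names, and tolerance are assumptions, not part of the posting.

```python
import pandas as pd


def reconcile(ledger_csv: str, bank_csv: str, tolerance: float = 0.01) -> pd.DataFrame:
    """Return rows needing investigation: missing on either side, or amount breaks."""
    ledger = pd.read_csv(ledger_csv)
    bank = pd.read_csv(bank_csv)
    merged = ledger.merge(
        bank, on="txn_id", how="outer", suffixes=("_ledger", "_bank"), indicator=True
    )
    # Transactions present in only one system.
    missing = merged[merged["_merge"] != "both"]
    # Transactions present in both systems but with amounts differing beyond tolerance.
    both = merged[merged["_merge"] == "both"].copy()
    both["diff"] = (both["amount_ledger"] - both["amount_bank"]).abs()
    breaks = both[both["diff"] > tolerance]
    return pd.concat([missing, breaks])


if __name__ == "__main__":
    exceptions = reconcile("ledger_export.csv", "bank_statement.csv")
    exceptions.to_csv("reconciliation_exceptions.csv", index=False)
```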
Job Summary:
We are seeking an experienced Salesforce Marketing Cloud (SFMC) Architect to lead the design, architecture, and implementation of enterprise-grade marketing automation solutions. You will play a pivotal role in defining scalable digital marketing strategies and integrating SFMC with CRM, analytics, and external platforms to drive personalized, data-driven customer experiences.

Key Responsibilities:
- Serve as the technical lead and solution architect for SFMC implementations, guiding cross-functional teams through the full lifecycle.
- Design end-to-end multi-channel campaign architectures (Email, SMS, Push, Web, Advertising Studio, etc.) using SFMC tools like Journey Builder, Automation Studio, and Interaction Studio.
- Architect integrations with Salesforce Sales/Service Cloud, CDPs, and external platforms using APIs, webhooks, and data extensions.
- Define data models and ETL workflows for subscriber and contact data using Data Designer, Contact Builder, and SQL in SFMC.
- Implement dynamic content personalization, segmentation logic, and advanced AMPscript/SSJS scripting for targeted communications.
- Set up data feeds and automations to synchronize engagement and behavioral data into and out of the SFMC ecosystem.
- Define governance frameworks: user roles, permissions, data retention, compliance (GDPR, CCPA), and security best practices.
- Lead technical discussions with client stakeholders, ensuring alignment of business requirements and scalable architecture.
- Conduct code reviews, documentation, and architectural assessments for performance, scalability, and maintainability.
- Stay up to date with the Salesforce ecosystem and recommend tools, accelerators, and best practices for marketing effectiveness.

Required Skills & Experience:
- 4+ years of experience in Marketing Automation / CRM / Martech, with 3+ years in SFMC architecture and implementations.
- Deep expertise across SFMC modules: Email Studio, Journey Builder, Automation Studio, Contact Builder, Audience Builder, Mobile Studio, and Advertising Studio.
- Strong experience with AMPscript, SSJS, SQL, and scripting-based personalization logic.
- Proficiency in API integrations (REST and SOAP), data imports/exports, and platform interoperability.
- Hands-on experience with Salesforce CRM integration and Marketing Cloud Connect.
- Experience designing multi-org/multi-BU SFMC setups with data partitioning and tenant governance.
- Knowledge of CDPs, customer journeys, real-time triggers, and data orchestration tools (like Segment, mParticle, or Adobe RT-CDP).
- Familiarity with compliance regulations (CAN-SPAM, GDPR, CASL) in email and SMS marketing.
- Experience with Agile project delivery, documentation, and stakeholder collaboration.

Preferred Qualifications:
- Salesforce Certified Marketing Cloud Consultant or Developer
- Experience with Interaction Studio / Personalization and Einstein features
- Experience working with data lakes, DMPs, or external identity resolution tools
- Understanding of DevOps practices, version control (e.g., Git), and CI/CD pipelines in the context of SFMC assets
Job Title: Senior ServiceNow Developer – ITOM & Discovery
Experience: 6+ Years
Location: Remote

Job Description:
We are seeking a highly skilled and experienced Senior ServiceNow Developer with a strong background in IT Operations Management (ITOM), particularly in Cloud, Pattern, and Network Discovery. The ideal candidate will have deep hands-on expertise in troubleshooting discovery issues, building and customizing Service Portals, and integrating ServiceNow with external systems.

Key Responsibilities:
- Perform Cloud, Pattern, and Network Discoveries using ServiceNow Discovery tools.
- Troubleshoot and resolve issues related to discovery patterns and network probes.
- Develop and customize ServiceNow Service Portal components as per client requirements.
- Design and implement integrations between ServiceNow and third-party systems using REST, SOAP, MID Servers, etc.
- Write clean, efficient, and scalable code aligned with ServiceNow development best practices.
- Collaborate with cross-functional teams to understand business needs and translate them into technical solutions.
- Participate in technical design reviews and code reviews, and contribute to continuous improvement of processes.

Required Skills:
- 6+ years of hands-on experience in ServiceNow development.
- Strong experience with ITOM modules, especially Discovery and CMDB.
- In-depth knowledge of pattern design, troubleshooting, and MID Server configuration.
- Proficiency in JavaScript, GlideScript, and ServiceNow Scripting APIs.
- Experience in Service Portal development and UI customization.
- Proven ability to build and manage API integrations (REST, SOAP).
- Excellent problem-solving skills and attention to detail.
- Strong communication and stakeholder management abilities.

Preferred Qualifications:
- ServiceNow Certified Implementation Specialist – ITOM
- ServiceNow Certified Application Developer
- Prior experience working in agile environments
Job Title: GCP Data Engineer
Location: Remote
Experience Level: 10+ years

Job Summary:
We are seeking a skilled GCP Data Engineer to join our cloud data team. The ideal candidate will have deep experience with data pipeline development, data lake/warehouse solutions, and data transformation using GCP-native tools.

Key Responsibilities:
- Design, build, and manage scalable data pipelines on Google Cloud Platform.
- Develop ETL/ELT processes using Cloud Dataflow, Apache Beam, and BigQuery (see the Apache Beam sketch after these postings).
- Ensure high data quality, integrity, and security.
- Implement data ingestion from various sources (streaming and batch).
- Collaborate with data scientists and analysts to support business insights.
- Optimize performance and cost of data pipelines and queries.

Required Qualifications:
- Bachelor's in Computer Science, Engineering, or a related field.
- Strong experience in GCP services: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, etc.
- Strong SQL and Python programming skills.
- Hands-on experience with data modeling, warehousing, and pipeline orchestration.
- Familiarity with Apache Beam or similar data processing frameworks.

Preferred Skills:
- GCP Professional Data Engineer certification.
- Experience with Terraform or Deployment Manager for infrastructure as code.
- Exposure to machine learning workflows on GCP.

Job Title: GCP Developer
Location: Remote
Experience Level: 10+ years

Job Summary:
We are looking for a GCP Developer to create and maintain applications on Google Cloud. You’ll work closely with architects and DevOps teams to build scalable, secure, and cloud-native solutions.

Key Responsibilities:
- Design and implement microservices using GCP-native tools and APIs.
- Build serverless applications using Cloud Functions, App Engine, or Cloud Run.
- Integrate with GCP services such as Pub/Sub, Firestore, and Cloud Storage.
- Monitor, debug, and optimize application performance on GCP.
- Participate in CI/CD pipeline implementation and cloud automation.

Required Qualifications:
- Proficiency in at least one programming language (Python, Java, Go, or Node.js).
- Experience developing applications on Google Cloud.
- Strong understanding of GCP services and APIs.
- Familiarity with RESTful APIs, containers (Docker), and Kubernetes (GKE).

Preferred Skills:
- GCP Associate Cloud Engineer or Professional Cloud Developer certification.
- Experience with Cloud Build, Cloud Monitoring, and IAM configuration.
- Background in agile methodologies and DevOps culture.
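A hedged Apache Beam (Python SDK) sketch of the kind of batch pipeline the GCP Data Engineer role describes; the bucket path, BigQuery table, and record layout are placeholders, and DataflowRunner-specific options are omitted.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn a 'user_id,amount' CSV line into a BigQuery-ready row."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


# Runs on the local DirectRunner by default; pass --runner=DataflowRunner
# (plus project, region, temp_location) to execute on Cloud Dataflow.
with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # destination table assumed to exist
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```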
Job Title: Teamcenter Developer / Architect
Location: Remote
Experience Required: 7+ Years

Job Summary:
We are seeking a highly experienced Teamcenter Developer/Architect to design, develop, and maintain Teamcenter PLM solutions in a distributed, remote environment. The ideal candidate will have a strong background in Teamcenter architecture, customization, and integration, with a focus on scalability, performance, and user experience.

Key Responsibilities:
- Design, develop, and implement scalable Teamcenter PLM solutions to support business processes.
- Lead the architecture and customization of Teamcenter modules including workflows, BMIDE, ITK, RAC, AWC, and SOA integrations.
- Collaborate with cross-functional teams including Engineering, Manufacturing, and IT to define PLM requirements and translate them into technical solutions.
- Develop customizations, extensions, and integrations using Teamcenter APIs (ITK, SOA, RAC, AWC).
- Optimize and troubleshoot existing Teamcenter implementations for performance and reliability.
- Guide and mentor junior developers and serve as a technical expert on Teamcenter best practices.
- Support upgrades, migrations, and environment setup.
- Produce technical documentation, architecture diagrams, and deployment guides.

Required Qualifications:
- 7+ years of hands-on experience in Teamcenter development and architecture.
- Strong expertise with Teamcenter Unified Architecture (UA), particularly BMIDE, ITK, SOA, RAC, AWC.
- Experience with Teamcenter data model design, workflows, and change management processes.
- Proficient in programming languages like Java, C++, or C#, and scripting (Shell, Perl, or Python).
- Experience in Teamcenter upgrades, patching, and performance tuning.
- Familiarity with CAD integrations (NX, CATIA, SolidWorks) is a plus.
- Strong understanding of PLM concepts, enterprise integrations (ERP, MES), and configuration management.
- Experience working in Agile/Scrum development environments.
- Excellent communication and documentation skills.

Preferred Qualifications:
- Siemens certification in Teamcenter.
- Experience with cloud-based PLM deployments or AWS/Azure.
- Prior experience in manufacturing, automotive, or aerospace domains.
Job Title: Senior SAP S/4HANA Consultant – HCM / SuccessFactors
Location: Remote
Experience Required: 8+ Years

About the Role:
We are seeking a highly skilled and results-driven Senior SAP S/4HANA Consultant with expertise in SAP HCM, SAP SuccessFactors customization, and SAP Analytics Cloud. This is a fully remote opportunity where you'll play a leading role in designing, implementing, and optimizing SAP S/4HANA solutions to support key business functions, particularly in the Human Capital Management (HCM) domain. This role is ideal for professionals who have a deep understanding of SAP’s HCM capabilities and SuccessFactors suite, and who are passionate about driving digital transformation initiatives across global enterprises.

Key Responsibilities:
- Lead full-cycle SAP S/4HANA HCM module implementations and migration projects.
- Customize and configure SAP HCM modules and SAP SuccessFactors to align with business needs.
- Work with SAP Analytics Cloud to deliver insights and performance metrics related to HR processes.
- Collaborate with business stakeholders to gather functional requirements and deliver tailored SAP solutions.
- Ensure seamless integration between SAP HCM and other modules like FI/CO, SD, MM, etc.
- Perform system testing, integration testing, and UAT.
- Provide subject matter expertise and advanced troubleshooting support for HCM-related issues.
- Prepare detailed documentation, including functional specs and end-user training materials.
- Keep up to date with the latest SAP innovations and suggest enhancements for continuous improvement.

Required Skills & Qualifications:
- 8+ years of hands-on SAP consulting experience, with a strong focus on SAP HCM.
- At least 2+ years of experience with SAP S/4HANA implementations.
- Demonstrated experience in customizing SAP SuccessFactors modules.
- Experience with SAP Analytics Cloud.
- Solid understanding of SAP best practices and inter-module integration.
- Familiarity with SAP Activate methodology and the Fit-to-Standard approach.
- Strong analytical thinking and communication skills.
- Ability to thrive in a remote, distributed team setup.
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.

Preferred Qualifications:
- SAP certification in HCM, SuccessFactors, or S/4HANA modules.
- Exposure to SAP BTP, embedded analytics, and Fiori apps.
- Experience with Agile methodologies and DevOps tools.
- Background in global delivery models or working with international clients.
Job Title: Senior QA Engineer – Tungsten / KTA
Location: Remote
Experience Required: 5+ Years
Industry: Information Technology / Automation / BPM

About the Role:
We are seeking a highly skilled and detail-oriented Senior QA Engineer with strong hands-on experience in Tungsten Automation (formerly Kofax TotalAgility, KTA) to join our remote team. The ideal candidate will bring deep QA expertise, solid automation testing knowledge, and a strong understanding of workflow automation platforms, particularly Tungsten/KTA.

Key Responsibilities:
- Design, develop, and execute test plans, test cases, and test scripts for applications built using Tungsten / KTA.
- Collaborate with development, business analysts, and product teams to understand business requirements and validate deliverables.
- Perform functional, integration, regression, and performance testing of KTA solutions.
- Set up and maintain automated testing frameworks and tools.
- Report, track, and manage defects throughout the software development lifecycle.
- Participate in design and code reviews from a QA perspective.
- Identify and recommend QA process improvements.
- Maintain documentation related to test plans, test cases, and test outcomes.
- Ensure high standards of quality and timely delivery of releases.

Required Skills & Qualifications:
- 6+ years of overall experience in Quality Assurance or Software Testing.
- 3+ years of hands-on experience with Tungsten Automation / Kofax TotalAgility (KTA).
- Experience in testing complex workflow and document automation solutions.
- Familiarity with automation testing tools and scripting.
- Strong knowledge of QA methodologies, tools, and best practices.
- Good understanding of the software development life cycle (SDLC) and Agile practices.
- Experience working with defect tracking tools like JIRA or Azure DevOps.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently in a remote setup and manage multiple priorities.

Preferred Qualifications:
- ISTQB or other QA certifications.
- Experience in testing integrations with ECM, OCR/ICR tools, or RPA platforms.
- Exposure to CI/CD pipelines and test automation in DevOps environments.
- Prior experience with remote, distributed teams.