
99 Data Testing Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Functional Tester: Design and execute test cases for various application functionalities. Perform functional, regression, and integration testing. Identify, document, and track functional defects. Collaborate with developers to resolve functional issues. Participate in code reviews and provide feedback on testability. Develop and maintain automated functional test scripts.
Database Tester: Design and execute test plans and test cases for database functionality, performance, and security. Perform data validation, integrity checks, and data consistency testing. Write and execute SQL queries to verify data accuracy and completeness. Identify, document, and track database-related defects. Collaborate with database administrators and developers to resolve database issues. Develop and maintain automated database test scripts. Monitor database performance and identify potential bottlenecks.
General: Contribute to the continuous improvement of testing processes and methodologies. Stay up to date with the latest testing technologies and trends. Communicate effectively with team members and stakeholders.
Note: Experience in handling/testing trading applications would be an advantage, as would experience in the Treasury and Trading domain.
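The "SQL queries to verify data accuracy and completeness" duty above can be sketched as a small script. This is a hypothetical illustration: the table and column names are invented, and a real suite would run against the production database rather than an in-memory SQLite instance.

```python
# Illustrative data-validation checks: compare row counts and column aggregates
# between a source and a target table. All names and values are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source_trades (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE target_trades (id INTEGER, amount REAL)")
rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
cur.executemany("INSERT INTO source_trades VALUES (?, ?)", rows)
cur.executemany("INSERT INTO target_trades VALUES (?, ?)", rows)

# Completeness check: source and target should hold the same number of records.
src_count = cur.execute("SELECT COUNT(*) FROM source_trades").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM target_trades").fetchone()[0]

# Accuracy check: column totals should match if amounts arrived uncorrupted.
src_sum = cur.execute("SELECT ROUND(SUM(amount), 2) FROM source_trades").fetchone()[0]
tgt_sum = cur.execute("SELECT ROUND(SUM(amount), 2) FROM target_trades").fetchone()[0]

result = {"counts_match": src_count == tgt_count, "sums_match": src_sum == tgt_sum}
print(result)
```

Any failed check would then be logged and tracked as a database-related defect, as the posting describes.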

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Thane

Work from Office

Outbound calls for lead generation. Present projects to potential clients. Schedule and coordinate site visits. Update CRM and follow up with leads.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Chennai

Work from Office

Job Title: Data Testers. Experience: 5-8 Years. Location: CHN/BGL/HYD/Pune, Remote.
Expectations: Proven experience working on large projects in the financial services or investment management domain, preferably with Aladdin or similar portfolio management systems. Deep understanding of investment management workflows, including portfolio management, trading, compliance, and risk management. Experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes (e.g., ETL). Experience working on Snowflake. Proficiency with API testing using Codecept/Postman & SOAPUI. Experience in database skills (SQL queries) for data migration testing and test data creation. Strong knowledge of Python packages. Experience working in an Agile environment and implementing CI/CD pipelines using Bamboo/GitLab. Proficiency in version control systems like Bitbucket/SVN is required. Working on and building data comparison (XML, JSON, Excel, database) utilities would be advantageous. Exposure to AWS/Azure environment(s).
Desirable Skills: Deep understanding of investment management workflows, including portfolio management, trading, compliance, and risk management. Knowledge of automation tools for deployment and configuration management. Experience working on iCEDQ. Certifications such as AWS, ISTQB, PMP, or Scrum Master are advantageous.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

18 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Key Responsibilities: Design, develop, and execute test cases to validate data ingestion from source systems (SQL, APIs) to the Databricks UAP platform. Perform schema validation, data completeness, transformation, and row-level data checks between source and target. Utilize SQL extensively for data profiling and validation. Leverage PySpark and Pandas for large-scale data comparison and automation. Maintain and enhance the Data Test Automation Framework using Python/PySpark for efficient and scalable data testing. Participate in daily stand-ups, sprint planning, and retrospectives following Agile practices. Manage and track test progress, issues, and risks in JIRA. Ensure adherence to QA documentation practices: Test Plans, Test Scenarios & Test Cases, Test Summary Reports, and Defect Reports. Own and drive the Defect Life Cycle, working closely with developers and product teams to ensure timely resolution. Collaborate with business analysts and developers to understand data requirements and ensure high test coverage.
Required Skills: 4+ years of experience in QA/testing with a strong focus on data validation. Strong proficiency in SQL (joins, subqueries, aggregations, data profiling). Experience in testing data pipelines ingesting from SQL and APIs to Databricks. Hands-on with Python and Pandas for data manipulation. Working knowledge of PySpark and familiarity with Spark DataFrames, transformations, and data handling. Knowledge of Agile testing processes and QA best practices. Familiarity with QA documentation and reporting standards. Experience with JIRA for test and defect management. Good understanding of the Defect Life Cycle and its role in maintaining software quality.
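The schema-validation and row-level checks described above can be sketched with plain Python structures; in practice the same logic would run over Pandas or Spark DataFrames. The records and field names here are invented for illustration.

```python
# Minimal sketch of schema, completeness, and row-level checks between a source
# extract and a target load. Dicts stand in for DataFrame rows.
source = [
    {"id": 1, "city": "Noida", "score": 10},
    {"id": 2, "city": "Pune", "score": 20},
]
target = [
    {"id": 1, "city": "Noida", "score": 10},
    {"id": 2, "city": "Pune", "score": 20},
]

# Schema validation: both sides should expose the same column set.
schema_ok = set(source[0]) == set(target[0])

# Data completeness: every source key should arrive in the target.
missing_ids = {r["id"] for r in source} - {r["id"] for r in target}

# Row-level check: compare full records keyed by id.
tgt_by_id = {r["id"]: r for r in target}
mismatched = [r["id"] for r in source if tgt_by_id.get(r["id"]) != r]

print(schema_ok, missing_ids, mismatched)
```

With PySpark, the same comparison is typically done with joins and `exceptAll` over the two DataFrames; the dict version above just makes the logic explicit.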

Posted 2 weeks ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Noida

Work from Office

Experience in Data Testing using SQL. Knowledge of Python is a plus. Domain expertise in Risk and Finance IT. Availability to provide overlap with EST hours (until noon EST).
Mandatory Competencies: ETL - Tester; Beh - Communication and collaboration; QA Manual - Test Case Creation, Execution, Planning, Reporting with risks/dependencies; Data Science and Machine Learning - Python.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

6 - 8 Lacs

Kolkata, Hyderabad, Chennai

Work from Office

Job Title: Data Migration Testing
Location State: Tamil Nadu, Telangana, West Bengal
Location City: Chennai, Hyderabad, Kolkata
Experience Required: 6 to 8 Years
CTC Range: 6 to 8 LPA
Shift: Day Shift
Work Mode: Onsite
Position Type: C2H
Openings: 2
Company Name: VARITE INDIA PRIVATE LIMITED
About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.
About The Job: Work Location: Chennai, TN / Hyderabad, TS / Kolkata, WB. Skills Required: Data Migration Testing, ETL Testing. Experience Range in Required Skills: 6-8 Years.
How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.
About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.
Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.
Unlock Rewards: Refer Candidates and Earn.
If you're not available or interested in this opportunity, please pass it along to anyone in your network who might be a good fit for our open positions. VARITE offers a Candidate Referral program: you'll receive a one-time referral bonus on the following scale if the referred candidate completes a three-month assignment with VARITE.
0 - 2 Yrs.: INR 5,000
2 - 6 Yrs.: INR 7,500
6+ Yrs.: INR 10,000

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

You will be working at AQM Technologies Pvt. Ltd., a company dedicated to providing exceptional testing experiences for all stakeholders. The company, established in 2000, boasts a team of over 1500 professionals, with 75% being ISTQB/ASTQB certified. AQM excels in various testing domains, including Quality Engineering, User Acceptance Testing, and Test Automation. Notably, more than 40% of the workforce is trained in Digital Testing. The company's strong client relationships are evident through their impressive 95% retention rate. AQM is globally recognized and certified as a "Great Place to Work" by the Great Place To Work Institute. As a Lead / STE / TE at AQM Technologies, you will be based in Mumbai in a full-time, on-site role. Your responsibilities will include manual testing of web and mobile applications, creating, executing, uploading, and maintaining test cases and test scenarios on JIRA, identifying, tracking, and reporting defects using JIRA or similar bug-tracking tools, ensuring compliance with banking & financial regulations for secure transactions, and collaborating with developers, business analysts, and product teams in an Agile environment. To excel in this role, you should have skills in Test Automation, Load & Performance Testing, proficiency in Security Testing, Cyber Security Audit, experience with User Acceptance Testing, Mobile Application Testing, Quality Engineering, and Data Testing. Strong analytical and problem-solving abilities, excellent communication and leadership skills, and a Bachelor's degree in computer science, Engineering, or related field are essential. 
Required skills include proficiency in Manual Testing (functional, UI, regression, and end-to-end testing), bug tracking & test management (JIRA or similar tools), an understanding of APIs (hands-on API testing not required), and 2+ years of experience in web application and mobile testing, API, DB, test design, test execution, the defect life cycle, the testing life cycle, functional testing, and defect tools. Experience in using JIRA for test design and test execution status is necessary. Basic knowledge of Figma design for web and mobile journeys is a plus.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Looking for a Data Tester with DBT (Data Build Tool) experience for a core conversion project. Offshore ETL Tester with knowledge of Power BI and DBT Labs. Very good in SQL concepts and query writing. Experience writing simple procedures using T-SQL. Hands-on experience in ETL testing. Knowledge of Power BI and DBT Labs. Testing of data loads/intakes/extracts and incremental loads. Good manual testing / UI testing. Ability to design test cases from requirements, and test planning. Very good communication and coordination skills. Knowledge of the banking domain / agency management is an added advantage.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description We are more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services - and our open-access model - we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. We are headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific. Responsibilities As a Senior Quality Assurance Engineer, you will be responsible for ensuring the quality and reliability of complex data-driven systems, with a focus on financial services applications. You will work closely with Data Engineers, Business Analysts, and Developers across global teams to validate functionality, accuracy, and performance of software solutions, particularly around data migration from on-premises to cloud platforms. 
Key responsibilities include: Leading and executing end-to-end test plans, including functional, unit, regression, and back-to-back testing. Designing test strategies for data migration projects, with a strong focus on Oracle-to-cloud transitions. Verifying data accuracy and transformation logic across multiple environments. Writing Python-based automated test scripts and utilities for validation. Participating in Agile ceremonies, collaborating closely with cross-functional teams. Proactively identifying and documenting defects, inconsistencies, and process improvements. Contributing to continuous testing and integration practices. Ensuring traceability between requirements, test cases, and delivered code.
Must-have skills: The ideal candidate must demonstrate strong experience (minimum 7 years) and hands-on expertise in the following areas.
Data Testing (Oracle to Cloud Migration): Deep understanding of testing strategies related to large-scale data movement and transformation validation between legacy on-premises systems and modern cloud platforms.
Python Scripting: Proficient in using Python for writing automated test scripts and tools to streamline testing processes.
Regression Testing: Proven ability to develop and manage comprehensive regression test suites ensuring consistent software performance over releases.
Back-to-Back Testing: Experience in comparing results between old and new systems or components to validate data integrity post-migration.
Functional Testing: Skilled in verifying system behavior against functional requirements in a business-critical environment.
Unit Testing: Capable of writing and executing unit tests for small code components to ensure correctness at the foundational level.
Nice to have: While not required, the following skills would be a strong plus and would enhance your effectiveness in the role.
Advanced Python Development: Experience in building complex QA tools or contributing to CI/CD pipelines using Python.
DBT (Data Build Tool): Familiarity with DBT for transformation testing and documentation in data engineering workflows.
Snowflake: Exposure to the Snowflake cloud data warehouse and understanding of its testing and validation mechanisms.
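Back-to-back testing, as mentioned in this role, feeds identical inputs to the legacy and the migrated implementation and diffs the outputs. A minimal sketch, with both "systems" stood in by invented functions:

```python
# Back-to-back test sketch: run the same cases through the old and new
# implementations and collect any inputs where the results diverge.
def legacy_price(qty: int, unit: float) -> float:
    # Stand-in for the legacy (e.g., on-premises Oracle) calculation.
    return round(qty * unit * 1.05, 2)

def migrated_price(qty: int, unit: float) -> float:
    # Stand-in for the migrated cloud implementation under test.
    return round(qty * unit * 1.05, 2)

cases = [(1, 9.99), (10, 3.50), (250, 0.07)]
diffs = [(q, u) for q, u in cases if legacy_price(q, u) != migrated_price(q, u)]
print("mismatches:", diffs)
```

In a real migration the case list would be driven by production data extracts, and any non-empty `diffs` would be raised as a defect with the offending inputs attached.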

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Hybrid

Job Summary: We are looking for a Quality Engineer (Data) to ensure the reliability, accuracy, and performance of data pipelines and AI/ML models in our SmartFM platform. This role is essential for delivering trustworthy data and actionable insights to optimize smart building operations.
Roles and Responsibilities: Design and implement QA strategies for data pipelines and ML models. Test data ingestion and streaming systems (StreamSets, Kafka) for accuracy and completeness. Validate data stored in MongoDB, ensuring schema and data integrity. Collaborate with Data Engineers to proactively address data quality issues. Work with Data Scientists to test and validate ML/DL/LLM/Agentic Workflow models. Automate data validation and model testing using tools like Pytest, Great Expectations, and Deepchecks. Monitor production pipelines for data drift, model degradation, and performance issues. Participate in code reviews and create detailed QA documentation. Continuously improve QA processes based on industry best practices.
Required Technical Skills: 5-10 years of experience in QA, with a focus on Data and ML testing. Proficient in SQL for complex data validation. Hands-on with StreamSets, Kafka, and MongoDB. Python scripting for test automation. Familiarity with ML model testing, metrics, and bias detection. Experience with cloud platforms (Azure, AWS, or GCP). Understanding of Node.js and React-based systems is a plus. Experience with QA tools like Pytest, Great Expectations, and Deepchecks.
Additional Qualifications: Excellent communication and documentation skills. Strong analytical mindset and attention to detail. Experience working cross-functionally with Engineers, Scientists, and Product teams. Passion for learning new technologies and QA frameworks. Domain knowledge in facility management, IoT, or building automation is a plus.
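Automated data validation of the kind this role describes is often written as pytest-discoverable test functions. A hedged sketch: the sensor records and thresholds below are invented, and a real suite would read from MongoDB or a Kafka topic rather than an in-memory list.

```python
# Illustrative pytest-style data-quality checks for streaming sensor data.
# Runnable directly, or collectable by pytest since the functions are
# named test_*.
readings = [
    {"sensor": "hvac-01", "temp_c": 21.5},
    {"sensor": "hvac-02", "temp_c": 22.1},
    {"sensor": "hvac-03", "temp_c": 20.9},
]

def test_no_missing_sensor_ids():
    # Completeness: every record must carry a non-empty sensor id.
    assert all(r.get("sensor") for r in readings)

def test_temperatures_in_plausible_range():
    # Validity: values outside a physical range suggest ingestion corruption.
    assert all(-40.0 <= r["temp_c"] <= 60.0 for r in readings)

if __name__ == "__main__":
    test_no_missing_sensor_ids()
    test_temperatures_in_plausible_range()
    print("all data-quality checks passed")
```

Great Expectations and Deepchecks package the same idea as declarative expectation suites with reporting, so checks like these become configuration rather than hand-written asserts.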

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience in designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will be responsible for designing efficient data architecture diagrams, developing and maintaining data models, and ensuring data integrity, quality, and security. You will also work on data processing, data integration, and building data pipelines to support various business needs. Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, as well as mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities. We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focusing on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics is required. Strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential. Excellent problem-solving, collaboration, and communication skills are also important for this role. As part of our team, you will have the opportunity to work on exciting projects across various industries like High-Tech, communication, media, healthcare, retail, and telecom. We offer a collaborative environment where you can expand your skills by working with a diverse team of talented individuals. 
GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees. Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Gurugram

Work from Office

Location: Bangalore/Hyderabad/Pune. Experience level: 8+ Years.
About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations improving code quality, lineage visibility, performance, and engineering best practices.
Key Responsibilities: Lead the migration of legacy SQL-based ETL logic to DBT-based transformations. Design and implement a scalable, modular DBT architecture (models, macros, packages). Audit and refactor legacy SQL for clarity, efficiency, and modularity. Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement. Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines. Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration). Define and enforce coding standards, review processes, and documentation practices. Coach junior data engineers on DBT and SQL best practices. Provide lineage and impact analysis improvements using DBT's built-in tools and metadata.
Must-Have Qualifications: 8+ years of experience in data engineering. Proven success in migrating legacy SQL to DBT, with visible results. Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages. Proficient in SQL performance tuning, modular SQL design, and query optimization. Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration. Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery).
Familiarity with data testing and CI/CD for analytics workflows. Strong communication and leadership skills; comfortable working cross-functionally.
Nice-to-Have: Experience with DBT Cloud or DBT Core integrations with Airflow. Familiarity with data governance and lineage tools (e.g., dbt docs, Alation). Exposure to Python (for custom Airflow operators/macros or utilities). Previous experience mentoring teams through modern data stack transitions.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Greater Noida

Work from Office

SQL DEVELOPER: Design and implement relational database structures optimized for performance and scalability. Develop and maintain complex SQL queries, stored procedures, triggers, and functions. Optimize database performance through indexing, query tuning, and regular maintenance. Ensure data integrity, consistency, and security across multiple environments. Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools. Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation. Monitor and troubleshoot database performance issues. Automate routine database tasks using scripts and tools. Document database architecture, processes, and procedures for future reference. Stay updated with the latest SQL best practices and database technologies.
Data Retrieval: SQL Developers must be able to query large and complex databases to extract relevant data for analysis or reporting.
Data Transformation: They often clean, join, and reshape data using SQL to prepare it for downstream processes like analytics or machine learning.
Performance Optimization: Writing queries that run efficiently is key, especially when dealing with big data or real-time systems.
Understanding of Database Schemas: Knowing how tables relate and how to navigate normalized or denormalized structures is essential.
QE: Design, develop, and execute test plans and test cases for data pipelines, ETL processes, and data platforms. Validate data quality, integrity, and consistency across various data sources and destinations. Automate data validation and testing using tools such as PyTest, Great Expectations, or custom Python/SQL scripts. Collaborate with data engineers, analysts, and product managers to understand data requirements and ensure test coverage. Monitor data pipelines and proactively identify data quality issues or anomalies. Contribute to the development of data quality frameworks and best practices.
Participate in code reviews and provide feedback on data quality and testability. Strong SQL skills and experience with large-scale data sets. Proficiency in Python or another scripting language for test automation. Experience with data-testing tools. Familiarity with cloud platforms and data warehousing solutions.
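Two of the integrity checks this role calls for, detecting duplicate keys and orphaned foreign keys, can be written as plain SQL. A hypothetical sketch against an in-memory SQLite database with invented tables:

```python
# Integrity checks a data QE would automate: duplicate keys and orphaned
# foreign-key references. Table names and rows are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "A"), (2, "B")])
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(10, 1), (11, 2), (12, 2)])

# Duplicate-key check: any customer id appearing more than once is a defect.
dupes = cur.execute(
    "SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1"
).fetchall()

# Referential-integrity check: orders pointing at a nonexistent customer.
orphans = cur.execute(
    "SELECT o.id FROM orders o LEFT JOIN customers c ON o.customer_id = c.id "
    "WHERE c.id IS NULL"
).fetchall()

print("duplicate ids:", dupes, "orphan orders:", orphans)
```

Wrapped in a test runner, non-empty `dupes` or `orphans` results would fail the pipeline run and be logged as data-quality defects.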

Posted 3 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune. Experience level: 8+ Years.
About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations improving code quality, lineage visibility, performance, and engineering best practices.
Key Responsibilities: Lead the migration of legacy SQL-based ETL logic to DBT-based transformations. Design and implement a scalable, modular DBT architecture (models, macros, packages). Audit and refactor legacy SQL for clarity, efficiency, and modularity. Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement. Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines. Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration). Define and enforce coding standards, review processes, and documentation practices. Coach junior data engineers on DBT and SQL best practices. Provide lineage and impact analysis improvements using DBT's built-in tools and metadata.
Must-Have Qualifications: 8+ years of experience in data engineering. Proven success in migrating legacy SQL to DBT, with visible results. Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages. Proficient in SQL performance tuning, modular SQL design, and query optimization. Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration. Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery).
Familiarity with data testing and CI/CD for analytics workflows. Strong communication and leadership skills; comfortable working cross-functionally.
Nice-to-Have: Experience with DBT Cloud or DBT Core integrations with Airflow. Familiarity with data governance and lineage tools (e.g., dbt docs, Alation). Exposure to Python (for custom Airflow operators/macros or utilities). Previous experience mentoring teams through modern data stack transitions.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a skilled ETL Data Tester to join our dynamic team on a 6-month contract. The ideal candidate will focus on implementing ETL processes, creating comprehensive test suites using Python, and validating data quality through advanced SQL queries. The role involves collaborating with Data Scientists, Engineers, and Software teams to develop and monitor data tools, frameworks, and infrastructure changes. Proficiency in Hive QL, Spark QL, and Big Data concepts is essential. The candidate should also have experience in data testing tools like DBT, iCEDQ, and QuerySurge, along with expertise in Linux/Unix and messaging systems such as Kafka or RabbitMQ. Strong analytical and debugging skills are required, with a focus on continuous automation and integration of data from multiple sources. Location: Chennai, Ahmedabad, Kolkata, Pune, Hyderabad, Remote

Posted 3 weeks ago

Apply

10.0 - 15.0 years

1 Lacs

Hyderabad

Remote

Complex Data Testing Engineer/ Relational Data Testing Specialist/ Big Data Testing Engineer

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 - 3 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Exp: 3+ Years
JD: Primary Skills: Data tester (functional testing). Automation experience, preferably in Python/PySpark. Working knowledge of NoSQL, preferably MongoDB (or JSON format). Working knowledge of AWS S3. Working knowledge of JIRA/Confluence and defect management. Understands Agile ways of working. Years of experience: minimum 3 years.

Posted 4 weeks ago

Apply

4.0 - 6.0 years

10 - 13 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Role & responsibilities: Data tester (functional testing). Automation experience, preferably in Python/PySpark. Working knowledge of NoSQL, preferably MongoDB (or JSON format). Working knowledge of AWS S3. Working knowledge of JIRA/Confluence and defect management. Understands Agile ways of working. Years of experience: minimum 3 years.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Lead test planning & execution. Ensure SDLC & testing compliance. Analyze requirements & estimate effort. Liaise with stakeholders & teams. Track defects & test progress. Drive end-to-end testing delivery.
Required candidate profile: Strong in ETL, SQL, Control-M. Proficient in Agile/Waterfall, JIRA, Confluence. Experience in DW, Informatica, Azure DevOps. Skilled in automation, stakeholder management, and communications. Onsite-offshore coordination.

Posted 4 weeks ago

Apply

4.0 - 8.0 years

8 - 10 Lacs

Chennai

Work from Office

Key Responsibilities (Data Analyst, Testing & Validation): Perform end-to-end testing of ETL processes, including data extraction, transformation, and loading across different data sources. Conduct data validation and reconciliation to ensure accuracy, completeness, and integrity of data across systems. Write and execute complex SQL queries to validate data quality and perform data analysis. Collaborate with developers, data engineers, and business analysts to understand data requirements and test cases. Design, document, and maintain test plans, test cases, and test scripts for data pipelines and BI reports. Identify and report data issues or anomalies, track defects, and verify resolutions through retesting. Use tools like Informatica, Talend, SSIS, or similar for ETL testing (based on the tech stack). Ensure compliance with data governance, privacy, and quality standards during testing.
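The reconciliation duty described above is often automated by fingerprinting each record on both sides of an ETL hop and diffing the fingerprints to localize mismatches. A minimal sketch with invented field names and values:

```python
# Hash-based reconciliation sketch: fingerprint each row before and after the
# ETL step, then report the keys whose fingerprints differ.
import hashlib
import json

def fingerprint(row: dict) -> str:
    # Stable hash of a row: serialize with sorted keys, then SHA-256.
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

extracted = [{"acct": "A-1", "bal": 500}, {"acct": "A-2", "bal": 750}]
loaded = [{"acct": "A-1", "bal": 500}, {"acct": "A-2", "bal": 750}]

src = {r["acct"]: fingerprint(r) for r in extracted}
tgt = {r["acct"]: fingerprint(r) for r in loaded}
drifted = sorted(k for k in src if src[k] != tgt.get(k))
print("records needing investigation:", drifted)
```

Keying the diff by account lets a tester report exactly which records to retest, rather than just that the datasets differ.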

Posted 4 weeks ago

Apply

2.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Hybrid

Primary Skills: Data tester (functional testing). Automation experience, preferably in Python/PySpark. Working knowledge of NoSQL, preferably MongoDB (or JSON format). Working knowledge of AWS S3. Working knowledge of JIRA/Confluence and defect management. Understands Agile ways of working. Years of experience: minimum 3 years.

Posted 4 weeks ago

Apply

7.0 - 10.0 years

15 - 30 Lacs

Greater Noida

Work from Office

Job Overview: Responsible for validating data pipelines, ensuring data integrity, and verifying the performance and functionality of solutions built on the Microsoft Fabric platform. This includes testing data ingestion, transformation, and reporting components, particularly within environments like the Delta Lakehouse. Key Responsibilities: Design and execute test plans and test cases for Microsoft Fabric-based data solutions. Validate data flows from source systems to the Microsoft Fabric Delta Lakehouse. Perform data quality checks, reconciliation, and validation of large datasets. Collaborate with data engineers, architects, and BI developers to ensure end-to-end data accuracy. Automate testing processes using tools like Azure Data Factory, Power BI, and Fabric-native tools. Identify, document, and track defects and inconsistencies. Ensure compliance with data governance and security standards. Required Skills: Strong experience in data testing, preferably in cloud-based environments. Familiarity with Microsoft Fabric components (e.g., Data Factory, Synapse, Power BI, Delta Lakehouse). Proficiency in SQL for data validation and transformation testing. Experience with test automation frameworks and scripting. Understanding of ETL/ELT processes and data modelling concepts. Excellent analytical and problem-solving skills.

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Noida

Work from Office

5-8 years of experience in Data Testing using SQL. Knowledge of Python is a plus. Domain expertise in Risk and Finance IT. Availability to provide overlap with EST hours (until noon EST).
Mandatory Competencies: ETL - Tester; QA Manual - Manual Testing; Database - SQL; Beh - Communication and collaboration.

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Perform test automation analysis and identify feasible areas for automation. Identify, document, and track software defects and issues using bug-tracking tools (e.g., JIRA, Bugzilla). Work closely with development teams to prioritize and resolve defects, ensuring timely delivery of high-quality software. Own the requirements of their features, perform effort estimation, plan testing, and design, create, and execute automation scripts using ConfirmIQ and Hexawise. Experience in Selenium for web testing, RestAssured for API testing, or Appium for mobile testing. For data testing, SQL is used; knowledge of Python is good to have. Proficient in programming languages such as Java and Python. Experience with test automation tools (e.g., Selenium, TestNG, JUnit). Familiarity with performance testing tools (e.g., JMeter, LoadRunner). Knowledge of CI/CD tools and processes (e.g., Jenkins, GitLab CI). Good understanding of software development methodologies (e.g., Agile, Scrum).
Mandatory Competencies: QA Automation - QA Automation; QA Automation - Selenium; Database - SQL; DevOps - Jenkins; DevOps - GitLab.

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Noida

Work from Office

5-8 years of experience in Data Testing using SQL. Knowledge of Python is a plus. Domain expertise in Risk and Finance IT.
Mandatory Competencies: ETL - Tester; QA Manual - Manual Testing; Beh - Communication and collaboration; Database - SQL.

Posted 1 month ago

Apply