
4 QA Frameworks Jobs

JobPe aggregates these listings for easy access, but you apply directly on the original job portal.

15.0 - 19.0 years

0 - 0 Lacs

Karnataka

On-site

You should have over 15 years of experience in Project Delivery, Client Engagement, or Analytics Delivery Management. Your background should include strong experience in Analytics/Consulting/IT Services, particularly in handling complex project execution and governance. You should possess client-facing skills with exposure to US/global clients and have experience handling client escalations and governance reviews. You should be familiar with tools and methodologies such as DS & DE (on a scale of 7/10), QA frameworks, risk tracking, BI dashboards (Tableau/Power BI), JIRA, ServiceNow, or similar. Your leadership abilities should enable you to manage multiple project teams, mentor Delivery Managers, and drive process improvement. You should also have experience managing team sizes of 50+ and handling a portfolio of $4-5mn or more.

Posted 2 weeks ago


15.0 - 17.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experience: 15+ years in Project Delivery, Client Engagement, or Analytics Delivery Management.
Background: Strong experience in Analytics/Consulting/IT Services, handling complex project execution & governance.
Client-Facing Skills: Exposure to US/Global clients, with experience in handling client escalations & governance reviews.
Tools & Methodologies: Familiarity with DS & DE on a scale of 7/10; QA frameworks, risk tracking, BI dashboards (Tableau/Power BI), JIRA, ServiceNow, or similar.
Leadership: Ability to manage multiple project teams, mentor Delivery Managers, and drive process improvement.
Has managed team sizes of 50+ and a portfolio of $4-5mn or more.

Posted 3 weeks ago


7.0 - 10.0 years

27 - 37 Lacs

Gurugram

Work from Office

We are seeking a seasoned QA and reliability engineering leader to define and drive our automation-first quality strategy, deeply embedded in our CI/CD workflows and platform engineering culture. This role is core to delivering robust, single-click pipelines, high-reliability environments, and data-driven insights at scale. As a Senior Engineer, you'll own strategy and execution for test automation, synthetic data generation, load and performance testing, resilience testing, and environment optimization across platform services.

Your Role & Responsibilities
- Build and scale automated QA frameworks across services, APIs, and UI for platform-grade quality
- Drive test enforcement in single-click CI/CD pipelines using GitHub Actions, ArgoCD, Jenkins, or equivalent
- Define and implement automated test data creation strategies using factories, mocks, and synthetic data generation tools
- Design and manage on-demand performance and load test environments using tools like k6, JMeter, Gatling, or Locust
- Architect and orchestrate resilience, fault-injection, and chaos testing strategies for critical platform components
- Establish performance environment strategy for multi-tenant, scalable systems with dynamic test coverage
- Define and track SLOs, SLAs, and error budgets, collaborating with SRE and observability teams
- Drive shift-left testing practices and embed automated validations early in the development lifecycle
- Enable self-service quality tools for developers and platform teams to improve test feedback loops
- Continuously evolve NFR validation strategies to ensure performance, stability, and availability benchmarks are met

Skills & Experience Required
- 6+ years in QA automation, reliability engineering, or platform QA roles
- Hands-on experience with automation frameworks: Playwright, Selenium, REST-assured, or similar
- Proven experience in load, stress, and chaos testing for cloud-native distributed systems
- Deep experience with test data strategy, including synthetic test data, mocking, and data anonymization
- Expertise with CI/CD tooling and integration of quality gates within pipelines (ArgoCD, GitHub Actions, Jenkins, etc.)
- Strong knowledge of cloud environments (GCP, Azure) and Kubernetes-based deployments
- Proficient with performance monitoring and observability stacks (Grafana, Prometheus, New Relic, etc.)
- Familiarity with contract testing, component testing, and non-functional validation techniques

Nice to Have
- Experience working with multi-tenant systems and complex environment reduction strategies
- Knowledge of test virtualization or mock service generation for isolated testing
- Exposure to GenAI tools in test acceleration, log summarization, or automated test generation
- Prior contribution to chaos engineering frameworks like Gremlin, Litmus, ChaosMonkey, or ChaosMesh
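As a rough illustration of the load and performance testing work described above, here is a minimal sketch using Locust, one of the tools the posting names. The host and endpoint paths (/api/health, /api/orders) are hypothetical placeholders, not details from the posting.

```python
# locustfile.py -- minimal load-test sketch; endpoints are hypothetical placeholders.
from locust import HttpUser, task, between

class PlatformUser(HttpUser):
    # Each simulated user pauses 1-3 seconds between tasks.
    wait_time = between(1, 3)

    @task(3)
    def read_health(self):
        # Lightweight read path, weighted 3x relative to the write path below.
        self.client.get("/api/health")

    @task(1)
    def create_order(self):
        # Representative write path with a small JSON payload.
        self.client.post("/api/orders", json={"sku": "demo-item", "qty": 1})
```

A run such as `locust -f locustfile.py --host https://staging.example.com --users 50 --spawn-rate 5 --headless` would then drive the simulated traffic; the host is again a placeholder for whatever on-demand performance environment the team provisions.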

Posted 1 month ago


9.0 - 12.0 years

15 - 30 Lacs

Gurugram

Remote

We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines, with a strong focus on time series forecasting, upsert-ready architectures, and enterprise-grade data governance. This role demands end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, QA, lineage tracking, and BI delivery. The ideal candidate will be highly proficient in AWS data services, PySpark, and versioned storage formats such as Apache Hudi or Iceberg. A strong understanding of data quality, observability, governance, and metadata management in large-scale analytical systems is critical.

Roles & Responsibilities
- Design and implement data lake zoning (Raw, Clean, Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-ready ETL pipelines using Apache Hudi or Iceberg.
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modeling.
- Optimize Athena datasets with partitioning, CTAS queries, and S3 metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging for performance and compliance.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate data quality frameworks such as Great Expectations, custom logs, and AWS CloudWatch for field-level validation and anomaly detection.
- Apply data governance practices using tools like OpenMetadata or Atlan, enabling lineage tracking, data cataloging, and impact analysis.
- Establish QA automation frameworks for pipeline validation, data regression testing, and UAT handoff.
- Collaborate with BI, QA, and business teams to finalize schema design and deliverables for dashboard consumption.
- Ensure compliance with enterprise data governance policies and enable discovery and collaboration through metadata platforms.

Preferred Candidate Profile
- 9-12 years of experience in data engineering.
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and the Glue Data Catalog.
- Strong command of PySpark, dbt-core, CTAS query optimization, and advanced partition strategies.
- Proven experience with versioned ingestion using Apache Hudi, Iceberg, or Delta Lake.
- Experience in data lineage, metadata tagging, and governance tooling using OpenMetadata, Atlan, or similar platforms.
- Proficiency in feature engineering for time series forecasting (lags, rolling windows, trends).
- Expertise in Git-based workflows, CI/CD, and deployment automation (Bitbucket or similar).
- Strong understanding of time series KPIs: revenue forecasts, occupancy trends, demand volatility, etc.
- Knowledge of statistical forecasting frameworks (e.g., Prophet, GluonTS, scikit-learn).
- Experience with Superset or Streamlit for QA visualization and UAT testing.
- Experience building data QA frameworks and embedding data validation checks at each stage of the ETL lifecycle.
- Independent thinker capable of designing systems that scale with evolving business logic and compliance requirements.
- Excellent communication skills for collaboration with BI, QA, data governance, and business stakeholders.
- High attention to detail, especially around data accuracy, documentation, traceability, and auditability.
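To make the feature-engineering requirement above concrete, here is a minimal PySpark sketch of lagged and rolling-window features for a daily revenue series. The column names (store_id, date, revenue) and S3 paths are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: lag and rolling-window features for forecast-ready datasets.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("forecast-features").getOrCreate()

# Hypothetical clean-zone table with columns: store_id, date, revenue
daily = spark.read.parquet("s3://example-bucket/clean/daily_revenue/")

w = Window.partitionBy("store_id").orderBy("date")
w7 = w.rowsBetween(-6, 0)  # trailing 7-row window, current day included

features = (
    daily
    .withColumn("revenue_lag_1", F.lag("revenue", 1).over(w))    # previous day
    .withColumn("revenue_lag_7", F.lag("revenue", 7).over(w))    # same day last week
    .withColumn("revenue_roll_7_avg", F.avg("revenue").over(w7)) # 7-day moving average
)

# Write the modeled-zone output, partitioned for Athena-friendly scans.
features.write.mode("overwrite").partitionBy("store_id").parquet(
    "s3://example-bucket/modeled/forecast_features/"
)
```

Partitioning the window by store keeps each lag within that entity's own history, which is the usual pattern for building forecast-ready datasets of this kind.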

Posted Date not available


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies