7dxperts

We are excited about the launch of 7Dxperts, part of our team's ongoing commitment to driving growth and innovation in the data, analytics, ML, and geospatial space. To ensure our continued growth and focus, we made the strategic decision to spin out the analytics business from zsah ltd. This move will enable us to invest more in our propositions and our staff while pushing the boundaries of what's possible in the realm of data. We firmly believe that targeted solutions designed for specific use cases hold more power than generic ones. At the core of the business, therefore, is bringing together people who care about customers, have a passion for solving problems, and have the expertise to build targeted accelerators and solutions for industry-specific problems. 📌 Visit our website to get to know us better.

7 Job openings at 7dxperts
Data Warehouse/ETL - Test Lead/Manager | Bengaluru | 7-12 years | INR 15.0-27.5 Lacs P.A. | Hybrid | Full Time

Role & responsibilities:
- Lead and manage a team of QA testers for ETL, data warehouse, and BI report testing.
- Define and implement test strategies and test plans for DWH projects, covering data pipelines, data integrity, and reporting accuracy.
- Develop and execute comprehensive test cases and SQL queries to validate data accuracy, completeness, and transformation logic.
- Test ETL workflows for data extraction, transformation, and loading, ensuring alignment with business requirements.
- Perform data validation testing and reconciliation across data sources, staging, and data warehouses.
- Test and validate BI reports and dashboards built with tools like Tableau, Power BI, and ThoughtSpot, ensuring data correctness, visual accuracy, and performance.
- Collaborate with BI developers and stakeholders to validate KPIs, data visualizations, and user interface requirements.
- Perform performance testing for complex BI dashboards, ensuring scalability and optimal query performance.
- Oversee defect management and issue resolution using tools like Jira or Azure DevOps.
- Conduct root cause analysis for data issues and collaborate with engineering teams on resolution.
- Automate data validation processes using scripting tools such as Python or frameworks like dbt to improve testing efficiency (a minimal sketch of one such check follows this listing).
- Generate test reports and metrics, providing updates on testing progress, issues, and resolution status.
- Drive continuous improvement in QA processes, tools, and techniques to ensure testing scalability and efficiency.
- Ensure all testing adheres to organizational quality standards and best practices for DWH and BI projects.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in software testing with a strong focus on data warehousing, ETL testing, and BI testing.
- Solid understanding of data warehouse concepts (star schema, snowflake schema, OLAP, and dimensional modeling).
- Proficiency in writing complex SQL queries for data validation, reconciliation, and transformation testing.
- Hands-on experience testing ETL tools such as Informatica, Talend, Apache Airflow, or dbt.
- Expertise in testing and validating reports and dashboards built on BI tools: Tableau, Power BI, ThoughtSpot.
- Familiarity with cloud-based DWH platforms like Snowflake, Databricks, AWS Redshift, or Azure Synapse.
- Experience with defect management tools such as Jira, TestRail, or Azure DevOps.
- Strong analytical skills with the ability to troubleshoot data quality and performance issues.
- Experience with performance testing and optimization for BI dashboards and large-scale datasets.
- Excellent communication, leadership, and stakeholder management skills.
- Hands-on experience automating data validation using scripting languages like Python or tools like dbt.
- Familiarity with big data tools like Apache Spark, Hive, or Kafka.
- ISTQB or similar QA certification.
- Working knowledge of CI/CD pipelines and integration of automated data tests.
- Experience with data governance, security, and compliance practices.

Perks and benefits:
- Training in Databricks.
- Training and certification in ThoughtSpot.
- Support on certifications.
- Hands-on AWS, Azure, and GCP.
- Support on any cloud certification.
- Build leadership skills.
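To make the automation expectation above concrete, here is a minimal sketch of the kind of Python-based data validation this role involves. It reconciles row counts, a sum checksum, and key coverage between a staging table and its warehouse target. The table names, columns, and the in-memory SQLite connection are invented for illustration; in practice the same checks would run against a Snowflake or Databricks connection.

    import sqlite3

    # Illustrative only: SQLite stands in for a Snowflake/Databricks connection.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    """)

    def reconcile(source: str, target: str, key: str, measure: str) -> list[str]:
        """Compare row counts, a sum checksum, and key coverage between two tables."""
        failures = []
        for label, sql in [
            ("row count", "SELECT COUNT(*) FROM {t}"),
            ("sum checksum", f"SELECT ROUND(SUM({measure}), 2) FROM {{t}}"),
        ]:
            src = conn.execute(sql.format(t=source)).fetchone()[0]
            tgt = conn.execute(sql.format(t=target)).fetchone()[0]
            if src != tgt:
                failures.append(f"{label} mismatch: {source}={src} vs {target}={tgt}")
        # Keys present in staging but missing from the warehouse.
        missing = conn.execute(
            f"SELECT COUNT(*) FROM {source} s LEFT JOIN {target} t USING ({key}) "
            f"WHERE t.{key} IS NULL"
        ).fetchone()[0]
        if missing:
            failures.append(f"{missing} {key}(s) missing from {target}")
        return failures

    issues = reconcile("stg_orders", "dw_orders", "order_id", "amount")
    print("PASS" if not issues else "\n".join(issues))

The structure rather than the specific queries is the point: count, checksum, and key-coverage checks are the core of most reconciliation suites, and each failure message would feed the test reports described above.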

Data Engineer | Bengaluru | 5-8 years | INR 15.0-20.0 Lacs P.A. | Work from Office | Full Time

Role & responsibilities:
- 3+ years of experience in Spark, Databricks, Hadoop, and data and ML engineering.
- 3+ years of experience designing architectures using AWS cloud services and Databricks.
- Architect, design, and build a big data platform (data lake / data warehouse / lakehouse) using Databricks services, integrating with the wider AWS cloud services.
- Knowledge of and experience with infrastructure as code and CI/CD pipelines to build and deploy the data platform tech stack and solutions.
- Hands-on Spark experience supporting and developing data engineering (ETL/ELT) and machine learning (ML) solutions using Python, Spark, Scala, or R.
- Distributed systems fundamentals and optimising Spark distributed computing.
- Experience setting up batch and streaming data pipelines using Databricks DLT, jobs, and streams (a minimal batch sketch follows this listing).
- Understand the concepts and principles of data modelling, databases, and tables, and produce, maintain, and update relevant data models across multiple subject areas.
- Design, build, and test medium- to large-scale data pipelines (ETL/ELT) based on feeds from multiple systems, using a range of storage technologies and/or access methods; implement data quality validation and create repeatable, reusable pipelines.
- Experience designing metadata repositories, understanding the range of metadata tools and technologies used to implement them, and working with metadata.
- Understand build automation concepts and implement automation pipelines to build, test, and deploy changes to higher environments.
- Define and execute test cases and scripts, and understand the role of testing and how it works.

Preferred candidate profile:
- Big data technologies: Databricks, Spark, Hadoop, EMR, or Hortonworks.
- Solid hands-on experience in Python, Spark, SQL, Spark SQL, Spark Streaming, Hive, and Presto.
- Experience with Databricks components and APIs: notebooks, jobs, DLT, interactive and job clusters, SQL warehouses, policies, secrets, DBFS, Hive Metastore, Glue Metastore, Unity Catalog, and MLflow.
- Knowledge of and experience with AWS Lambda, VPC, S3, EC2, API Gateway, IAM users, roles and policies, Cognito, Application Load Balancer, Glue, Redshift, Redshift Spectrum, Athena, and Kinesis.
- Experience with source control tools like Git, Bitbucket, or AWS CodeCommit, and automation tools like Jenkins, AWS CodeBuild, and CodeDeploy.
- Hands-on experience with Terraform and the Databricks API to automate the infrastructure stack.
- Experience implementing CI/CD and MLOps pipelines using Git, GitHub Actions, or Jenkins.
- Experience delivering project artifacts such as design documents, test cases, traceability matrices, and low-level design documents.
- Build reference architectures, how-tos, and demo applications for customers.
- Willingness to complete certifications.
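For a flavour of the batch pipeline work described above, here is a minimal PySpark sketch. It assumes only a local Spark installation; the input path, column names, and output location are hypothetical placeholders. On Databricks, a DLT version would wrap similar transformations in @dlt.table definitions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

    # Hypothetical source feed; in practice this might be an S3 path or a DLT input.
    orders = (
        spark.read.option("header", "true")
        .csv("/data/raw/orders.csv", inferSchema=True)
    )

    # Basic data quality gate: drop rows missing a key or with non-positive amounts.
    clean = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))

    # Simple transformation: daily revenue and order count per customer.
    daily_revenue = (
        clean.withColumn("order_date", F.to_date("order_ts"))
        .groupBy("customer_id", "order_date")
        .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
    )

    # Write partitioned output; on Databricks this would typically be Delta format.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
        "/data/curated/daily_revenue"
    )
    spark.stop()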

Lead Application Developer - Python, Django, Flask | Bangalore Rural, Bengaluru | 7-10 years | INR 15.0-30.0 Lacs P.A. | Work from Office | Full Time

Role & responsibilities:
- Define and drive the technical vision and strategic roadmap for Data APIs and related platform capabilities.
- Work closely with product management and stakeholders to translate requirements into scalable solutions.
- Lead initiatives around AI/ML integration, ensuring alignment with business goals and technical feasibility.
- Champion best practices in API design, performance, security, and maintainability.
- Manage project timelines and deliverables, ensuring on-time, high-quality outcomes.
- Provide hands-on mentorship and technical guidance to mid-level and junior developers across the team.
- Design, develop, and maintain RESTful APIs using Python with Flask or Django (a minimal sketch follows this listing).
- Develop backend services that support data ingestion, geospatial analysis, and real-time data access.
- Write efficient, optimized SQL queries across PostgreSQL, MS SQL, and MySQL databases.
- Implement secure authentication and authorization mechanisms (SSO, OAuth2).
- Integrate and operationalize AI/ML models within backend systems.
- Troubleshoot, debug, and resolve complex technical issues in a production environment, guiding the team in best practices.
- Lead code reviews and enforce clean, scalable coding practices.
- Contribute to and improve CI/CD pipelines and development workflows.

Preferred candidate profile:
- 5+ years of professional experience in backend/API development with Python.
- 3+ years in a technical leadership or lead developer capacity.
- Strong experience with Flask, Django, or similar Python web frameworks.
- Solid expertise in RESTful API design, development, and maintenance.
- Knowledge of large language models (LLMs) and integrating them within applications; experience fine-tuning LLMs; understanding of agentic AI and agent development.
- Advanced SQL proficiency and experience working with PostgreSQL, MySQL, or MS SQL.
- Familiarity with Git, CI/CD pipelines, and collaborative development workflows.
- Understanding of SSO/OAuth protocols and security best practices in web applications.
- Experience defining and delivering a technical roadmap and working in Agile/Scrum teams.
- Proven ability to lead teams, mentor developers, and manage priorities in a cross-functional environment.
- Excellent communication skills and the ability to work across technical and non-technical teams.

Good to have:
- Hands-on experience with AI/ML integration in applications or data pipelines.
- Familiarity with PySpark, Spark SQL, and platforms like Databricks.
- Knowledge of the data analytics and geospatial domains.
- Exposure to Linux administration and DevOps tools.
- Understanding of ReactJS or similar frontend libraries (for full-stack awareness).
- Passion for innovation and continuous learning.
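For a concrete sense of the API work, here is a minimal Flask sketch. Only Flask is assumed; the endpoint, the static bearer token, and the in-memory data are hypothetical, and a production service would sit on PostgreSQL/MySQL and validate tokens against a real OAuth2/SSO provider rather than a hard-coded value.

    from functools import wraps
    from flask import Flask, jsonify, request, abort

    app = Flask(__name__)

    # Placeholder secret; a real deployment would validate OAuth2/JWT tokens instead.
    API_TOKEN = "dev-token"

    # Stand-in for a PostgreSQL/MySQL-backed store of geospatial records.
    SITES = {1: {"id": 1, "name": "Depot A", "lat": 51.5072, "lon": -0.1276}}

    def require_token(view):
        """Reject requests that lack the expected bearer token."""
        @wraps(view)
        def wrapped(*args, **kwargs):
            auth = request.headers.get("Authorization", "")
            if auth != f"Bearer {API_TOKEN}":
                abort(401)
            return view(*args, **kwargs)
        return wrapped

    @app.get("/api/sites/<int:site_id>")
    @require_token
    def get_site(site_id: int):
        site = SITES.get(site_id)
        if site is None:
            abort(404)
        return jsonify(site)

    if __name__ == "__main__":
        app.run(debug=True)

As a quick smoke test, curl -H "Authorization: Bearer dev-token" http://localhost:5000/api/sites/1 should return the record as JSON, while omitting the header returns 401.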

ETL Test Analyst | Bengaluru | 2-7 years | INR 15.0-20.0 Lacs P.A. | Work from Office | Full Time

Role & responsibilities:
- Expertise in writing ETL test scenarios, test cases, and test scripts.
- Experience with and understanding of the test life cycle, defect management life cycle, etc.
- Ensure that test cases cover the full range of ETL scenarios, including data extraction, transformation, and loading.
- Execute ETL test cases and work closely with the QA Lead, development, and data teams to resolve defects and inconsistencies.
- Develop and execute SQL queries to validate data transformations and ensure data integrity (a minimal pytest-style sketch follows this listing).
- Implement and maintain data quality checks to monitor ETL process performance.
- Analyze source-to-target mapping documents.
- Test ETL mappings that extract data from multiple sources, such as SQL Server and flat files, into target tables using transformations.
- Ensure data accuracy, consistency, and compliance with regulatory requirements.
- Identify and report defects, working closely with the QA Lead and development teams to ensure timely resolution.
- Perform root cause analysis for defects and implement preventive measures.
- Contribute to identifying and implementing automation opportunities in ETL testing.
- Collaborate with automation engineers to script and execute automated test cases.
- Create, execute, and maintain automation test scripts to verify and validate product quality.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience (2 to 5 years) in ETL testing, with a strong understanding of data integration concepts and methodologies.
- Strong SQL skills for querying and validating data.
- Understanding of relational databases and data warehouse concepts.
- Familiarity with data warehousing, data modeling, and data transformation concepts.
- Excellent communication and interpersonal skills.
- A problem-solving mindset and the ability to troubleshoot complex issues.
- Experience with test automation tools and frameworks is a plus.
- Knowledge of scripting languages (e.g., Python, Shell) for test automation.
- Ability to multi-task and adapt quickly to change while maintaining urgency on assigned tasks.
- Detail-oriented with an analytical mindset.
- Experience working in Agile environments.
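As an illustration of the automated ETL test cases mentioned above, here is a minimal pytest-style sketch. The mapping rule under test (trim, upper-case, default NULLs) and the table names are invented for the example, standing in for rules taken from a real source-to-target mapping document; SQLite stands in for the actual source and target systems.

    import sqlite3
    import pytest

    @pytest.fixture()
    def db():
        """Seed a throwaway source table and apply the mapped transformation."""
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE src_customers (id INTEGER, country TEXT);
            INSERT INTO src_customers VALUES (1, ' gb'), (2, 'IN '), (3, NULL);
            -- Mapping rule under test: TRIM + UPPER, NULLs defaulted to 'XX'.
            CREATE TABLE tgt_customers AS
            SELECT id, UPPER(TRIM(COALESCE(country, 'xx'))) AS country_code
            FROM src_customers;
        """)
        yield conn
        conn.close()

    def test_no_rows_lost(db):
        src = db.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
        tgt = db.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]
        assert src == tgt

    def test_country_codes_normalised(db):
        rows = dict(db.execute("SELECT id, country_code FROM tgt_customers"))
        assert rows == {1: "GB", 2: "IN", 3: "XX"}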

Technical Delivery Lead - Cloud DWH | Bengaluru | 7-8 years | INR 15.0-25.0 Lacs P.A. | Work from Office | Full Time

Role & responsibilities:
- Oversee the design, implementation, and optimization of data warehousing solutions leveraging tools like Snowflake, Databricks, and other cloud data platforms.
- Lead the delivery of software projects from initiation through implementation.
- Lead the delivery of ETL processes for ingesting, transforming, and managing large-scale datasets.
- Lead the delivery of data analytics dashboards and reports on modern data stacks.
- Develop project plans, allocate resources, and track progress using project management tools such as Jira, Asana, Trello, or MS Project.
- Act as the primary point of contact for clients, building strong relationships, providing regular updates, and addressing concerns promptly.
- Manage risks and resolve project roadblocks to ensure timely delivery of high-quality solutions.
- Ensure projects align with data governance best practices, security protocols, and client standards.
- Provide technical guidance to the development team, ensuring high-quality and timely delivery.
- Work with stakeholders to define KPIs and ensure delivery meets business and technical goals.
- Drive continuous improvement initiatives in delivery processes, data quality, and team efficiency.
- Provide leadership and mentoring to project teams, fostering a culture of collaboration, accountability, and excellence.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience managing the delivery of data warehouse, data engineering, and data analytics projects.
- Strong experience with cloud-based data platforms such as Snowflake, Databricks, or Amazon Redshift.
- Proficiency in managing ETL pipelines and understanding data transformation processes.
- Solid knowledge of data warehousing concepts (e.g., dimensional modelling, star/snowflake schema, OLAP/OLTP systems).
- Experience working with SQL for data querying, performance optimization, and testing.
- Proven ability to manage multiple stakeholders, prioritize tasks, and ensure client satisfaction.
- Proficiency with project management tools: Jira, Asana, Trello, or MS Project.
- Familiarity with Agile, Scrum, and Waterfall methodologies.

Technical Delivery Lead - Cloud DWH | Bengaluru | 6-9 years | INR 20.0-35.0 Lacs P.A. | Work from Office | Full Time

Overview of role: The Delivery Lead will lead the successful delivery of data warehouse projects for enterprise clients. The role requires expertise in managing large-scale data initiatives, including planning, execution, and monitoring of complex Data Warehouse (DW) and ETL pipelines. The ideal candidate will have experience delivering projects on platforms such as Snowflake, Databricks, and other modern cloud-based data warehousing solutions.

Role & responsibilities:
- Oversee the design, implementation, and optimization of data warehousing solutions leveraging tools like Snowflake, Databricks, and other cloud data platforms.
- Lead the delivery of software projects from initiation through implementation.
- Lead the delivery of ETL processes for ingesting, transforming, and managing large-scale datasets.
- Lead the delivery of data analytics dashboards and reports on modern data stacks.
- Develop project plans, allocate resources, and track progress using project management tools such as Jira, Asana, Trello, or MS Project.
- Act as the primary point of contact for clients, building strong relationships, providing regular updates, and addressing concerns promptly.
- Manage risks and resolve project roadblocks to ensure timely delivery of high-quality solutions.
- Ensure projects align with data governance best practices, security protocols, and client standards.
- Provide technical guidance to the development team, ensuring high-quality and timely delivery.
- Work with stakeholders to define KPIs and ensure delivery meets business and technical goals.
- Drive continuous improvement initiatives in delivery processes, data quality, and team efficiency.
- Provide leadership and mentoring to project teams, fostering a culture of collaboration, accountability, and excellence.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience managing the delivery of data warehouse, data engineering, and data analytics projects.
- Strong experience with cloud-based data platforms such as Snowflake, Databricks, or Amazon Redshift.
- Proficiency in managing ETL pipelines and understanding data transformation processes.
- Solid knowledge of data warehousing concepts (e.g., dimensional modelling, star/snowflake schema, OLAP/OLTP systems).
- Experience working with SQL for data querying, performance optimization, and testing.
- Proven ability to manage multiple stakeholders, prioritize tasks, and ensure client satisfaction.
- Proficiency with project management tools: Jira, Asana, Trello, or MS Project.
- Familiarity with Agile, Scrum, and Waterfall methodologies.

What you can expect from us: We appreciate that individual growth is important, and we support you in every aspect of personal development.
- Training in Databricks.
- Training and certification in ThoughtSpot.
- Support on certifications.
- Hands-on AWS, Azure, and GCP.
- Support on any cloud certification.
- Build leadership skills.

Lead Application Developer - Python, Django | Bengaluru | 7-10 years | INR 15.0-30.0 Lacs P.A. | Hybrid | Full Time

Role & responsibilities: Outline the day-to-day responsibilities for this role.

Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.