
52 ETL Automation Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

4.0 - 7.0 years

7 - 17 Lacs

Bengaluru

Work from Office

We are seeking a skilled ETL/DWH Tester to support CXM testing for our insurance-sector clients. The candidate should have strong expertise in ETL testing and SQL, hands-on experience in the insurance domain, and proficiency in Agile, JIRA, and ETL automation.

Posted 3 months ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job Track Description:

Responsibilities
- Create and support complex DB objects such as stored procedures, custom functions, and triggers for both data warehouse and application development.
- Performance-tune DB objects, ETL, and SSRS reports.
- Design ETL, analysis, and reporting solutions.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Create and support ETL processes using SSIS and SQL Server jobs.
- Convert legacy reports to modern technologies using the Microsoft BI platform.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Guide team members toward the best solutions.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering, or equivalent experience.
- 6+ years of experience in SQL Server.
- 5+ years of experience in the Microsoft BI stack and reporting solutions (mainly Power BI and SSRS), including development of reports (RDLs, T-SQL, complex stored procedures, subscriptions, MDX) and dimensions, facts, and cubes.
- Data modeling / data warehouse design experience.
- Experience building ETL packages using SSIS is desirable.
- Strong understanding of SDLC and version control.
- Excellent oral and written communication skills.
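For illustration, a minimal Python sketch of the stored-procedure work this listing describes: creating and invoking a SQL Server procedure via pyodbc. The server, database, table, and procedure names are hypothetical, and CREATE OR ALTER requires SQL Server 2016 SP1 or later.

```python
# Minimal sketch (hypothetical server/database/table names) of creating and
# calling a SQL Server stored procedure from Python using pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Create (or replace) a simple aggregation procedure in the data warehouse.
cursor.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_DailySales @SalesDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT ProductKey, SUM(SalesAmount) AS TotalSales
    FROM dbo.FactSales
    WHERE OrderDate = @SalesDate
    GROUP BY ProductKey;
END
""")
conn.commit()

# Invoke the procedure with a parameter and fetch the result set.
cursor.execute("EXEC dbo.usp_DailySales @SalesDate = ?", "2024-01-31")
for row in cursor.fetchall():
    print(row.ProductKey, row.TotalSales)
conn.close()
```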

Posted 3 months ago

Apply

3.0 - 10.0 years

1 - 2 Lacs

Pune, Maharashtra, India

On-site

Our client is an EU subsidiary of a global financial bank operating in multiple markets and asset classes. The DWH/ETL Tester will work closely with the development team to design and build interfaces and integrate data from a variety of internal and external sources into the new Enterprise Data Warehouse environment. The tester will be primarily responsible for testing the Enterprise Data Warehouse using automation, within industry-recognized ETL standards, architecture, and best practices.

Responsibilities
- Perform intake of new ETL projects and initiatives; make high-level assessments of the roadmap in collaboration with leadership.
- Design the test strategy and test plan to address the needs of cloud-based ETL pipelines.
- Contribute to and manage testing deliverables.
- Ensure the implementation of test standards and best practices for the agile model and contribute to their development.
- Engage with internal stakeholders across the organization to seek alignment and collaboration; deal with external stakeholders and vendors.
- Identify risks and issues and present associated mitigating actions, taking into account the criticality of the underlying business domain.
- Contribute to continuous improvement of standard testing processes.

Skills
- Expert-level knowledge of data warehouse and RDBMS concepts.
- Expertise in cloud-based data warehouse solutions (ADF, Snowflake, GCP, etc.).
- Hands-on expertise in writing complex SQL using multiple JOINs and complex functions to test transformations and ETL requirements.
- Knowledge of and experience in creating test automation for database and ETL-testing regression suites.
- Automation using Selenium with Python (or JavaScript), Python scripts, and shell scripts.
- Knowledge of framework design and REST API testing of databases using Python.
- Experience with the Atlassian tool set and Azure DevOps.
- Experience in code and version management: Git, Bitbucket, Azure Repos, etc.

Qualifications
- A bachelor's degree or equivalent experience in computer science or similar.
- Experience crafting test strategies and supervising ETL/DWH test activities in multi-platform, sophisticated cloud-based environments.
- Strong analytical mindset, with the ability to extract relevant information from documentation, system data, clients, and colleagues, and to analyze the captured information.
- ISTQB Foundation Certificate in Software Testing.

Important to Have
- Proficiency in English (read/write/speak).
- Demonstrated ability to learn new technologies and to adapt easily to new circumstances, technologies, and procedures.
- Stress-resistant and constructive whatever the context.
- Able to align with existing standards and to act with attention to detail.
- A true standout colleague with good interpersonal skills, able to summarize complex technical situations in simple terms.
- Solution- and customer-focused; good communication skills, a positive attitude, and a competitive but team-oriented focus are key to success in this challenging environment.

Nice to Have
- Experience in the financial industry; knowledge of regulatory reporting and its terminology.
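As a concrete illustration of the "complex SQL to test transformations" skill above, a minimal sketch of a source-to-target rule check run from Python. The tables, columns, and the VALID-only load rule are all hypothetical, and sqlite3 is used only to keep the sketch self-contained and runnable.

```python
# Minimal sketch of an ETL transformation check: verify that the target table
# reflects the source under the (hypothetical) rule "only VALID orders load".
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INT, amount REAL, status TEXT);
CREATE TABLE dwh_orders (order_id INT, amount REAL);
INSERT INTO src_orders VALUES (1, 100.0, 'VALID'), (2, 50.0, 'CANCELLED'),
                              (3, 75.0, 'VALID');
INSERT INTO dwh_orders VALUES (1, 100.0), (3, 75.0);
""")

# Any row returned here is a violation of the transformation rule under test.
mismatches = conn.execute("""
    SELECT s.order_id
    FROM src_orders s
    LEFT JOIN dwh_orders t ON t.order_id = s.order_id
    WHERE (s.status = 'VALID' AND t.order_id IS NULL)      -- missing row
       OR (s.status = 'VALID' AND t.amount <> s.amount)    -- wrong value
       OR (s.status <> 'VALID' AND t.order_id IS NOT NULL) -- leaked row
""").fetchall()

assert not mismatches, f"ETL rule violated for orders: {mismatches}"
print("Transformation check passed.")
```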

Posted 3 months ago

Apply

3.0 - 5.0 years

6 - 8 Lacs

Chandigarh

Work from Office

Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines in Google Cloud Platform (GCP). This role collaborates closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities
- Data ingestion and landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
- ETL pipeline development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
- Data modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic-layer definitions to support downstream reporting.
- Load and orchestration: Load curated datasets into BigQuery across different zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting).
- Performance and quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
- Collaboration and documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience building ETL pipelines in GCP.
- Proficiency with Cloud Data Fusion, including Wrangler transformations.
- Strong command of SQL, including performance tuning in BigQuery.
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats.
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas.
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization.
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries).
- Excellent problem-solving skills and attention to detail.

Preferred (Good to Have)
- Exposure to Power BI data modeling and DAX.
- Experience with other GCP services (Dataflow, Dataproc).
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform).
- Knowledge of Python for custom transformations or orchestration scripts.
- Understanding of data governance best practices and metadata management.
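A minimal sketch of one step this listing describes: loading Parquet files from a Cloud Storage landing zone into a raw-zone BigQuery table, then promoting them to staging with SQL, using the google-cloud-bigquery client library. The project, bucket, dataset, and table names are hypothetical.

```python
# Minimal sketch (hypothetical project/bucket/dataset names): land Parquet
# files from Cloud Storage into a raw-zone BigQuery table, then promote the
# data to a staging zone with an in-BigQuery SQL transformation.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-zone/sales/2024-01-31/*.parquet",  # raw landing files
    "my-analytics-project.raw_zone.sales",              # raw-zone table
    job_config=job_config,
)
load_job.result()  # block until the load job completes

# Promote raw data to the staging zone with a SQL transformation.
client.query(
    """
    CREATE OR REPLACE TABLE staging_zone.sales AS
    SELECT order_id,
           CAST(amount AS NUMERIC) AS amount,
           DATE(order_ts) AS order_date
    FROM raw_zone.sales
    WHERE amount IS NOT NULL
    """
).result()
```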

Posted 3 months ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Pune, Chennai

Work from Office

Role: TSO. Experience: minimum 2 years. Must have experience in Data Migration Testing (ETL testing), manual and automation testing, Agile Scrum methodology, and the financial domain. Salary: up to 5.25 LPA. Location: Chennai/Pune. Regards, Vicky 7200396456.

Posted 3 months ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai

Work from Office

Data Scientist (6-8 Years' Experience)

Company Overview
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology, and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 700,000+ people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. At Accenture, we offer a dynamic and challenging environment for analytical and creative minds. We value a diverse workforce committed to achieving client goals and delivering results. We're built on a foundation of exceptional talent and are on the lookout for driven, energetic professionals to enhance our team.

Key Responsibilities

Data Handling
- Data collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data audit: Lead assessment of data quality, identify gaps, and create summaries per database requirements.
- Data scrubbing: Assist with creating data-cleansing rules and incorporate data clarifications provided by data-source owners.
- Data profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control-total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques.
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary research: Conduct market research to create company and industry primers using online secondary data and information sources.

Technical Requirements
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data-mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.

Collaboration and Communication
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Other Important Details
- Work location: Gurugram, India
- Expected start date: July 2024

Qualifications - Who Should Apply?
- Work experience: 6-8 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team-management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.
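A minimal sketch of the supervised spend-classification task this listing centers on: classifying free-text procurement line descriptions into taxonomy categories with scikit-learn. The categories and training rows are hypothetical toy data, not Accenture's actual taxonomy or tooling.

```python
# Minimal sketch (toy, hypothetical data): supervised spend classification of
# procurement line descriptions into a custom taxonomy using scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

descriptions = [
    "dell latitude laptop 14 inch", "annual software license renewal",
    "office chair ergonomic", "catering services town hall",
    "hp printer toner cartridge", "cloud hosting monthly invoice",
]
categories = ["IT Hardware", "Software", "Facilities",
              "Facilities", "IT Hardware", "Software"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # textual features per spend line
    LogisticRegression(max_iter=1000),     # one-vs-rest taxonomy classifier
)
model.fit(descriptions, categories)

# Classify new, unseen spend lines into the taxonomy.
print(model.predict(["lenovo thinkpad docking station",
                     "saas subscription annual fee"]))
```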

Posted 3 months ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Chennai, Coimbatore

Hybrid

Greetings from Cognizant! We are hiring for a permanent position with Cognizant. Experience: 3-6 years. Mandatory experience: ETL testing, MongoDB, Python, SQL, Azure Databricks. Work location: Chennai, Coimbatore. Interview mode: virtual. Interview dates: weekdays and weekends.

Posted 3 months ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Experience: 5-11 years
Location: Bangalore/Chennai/Hyderabad/Pune (face-to-face interview on 7th June)
Notice period: Immediate

Key Responsibilities:
- Design and execute test plans, test cases, and test scripts for ETL processes
- Validate data quality, integrity, and accuracy in source and target systems
- Identify and report defects, and collaborate with ETL developers to resolve issues
- Perform data profiling, data validation, and data reconciliation
- Develop and maintain test automation frameworks and scripts
- Lead and mentor junior testers, and provide guidance on testing best practices
- Collaborate with stakeholders to understand testing requirements and ensure compliance with testing standards
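To make the "data validation and reconciliation" responsibility concrete, a minimal sketch of a source-versus-target reconciliation using pandas. The DataFrames stand in for hypothetical extracts pulled from the source and target systems.

```python
# Minimal sketch: reconcile a source extract against the warehouse target
# using a keyed outer merge. The rows are hypothetical stand-ins for extracts
# pulled from the two systems.
import pandas as pd

source = pd.DataFrame({"customer_id": [1, 2, 3],
                       "balance": [100.0, 250.0, 80.0]})
target = pd.DataFrame({"customer_id": [1, 2, 4],
                       "balance": [100.0, 999.0, 60.0]})

merged = source.merge(target, on="customer_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)

missing_in_target = merged[merged["_merge"] == "left_only"]
unexpected_in_target = merged[merged["_merge"] == "right_only"]
value_mismatches = merged[(merged["_merge"] == "both") &
                          (merged["balance_src"] != merged["balance_tgt"])]

print(f"Missing in target:    {len(missing_in_target)}")    # customer 3
print(f"Unexpected in target: {len(unexpected_in_target)}") # customer 4
print(f"Value mismatches:     {len(value_mismatches)}")     # customer 2
```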

Posted 3 months ago

Apply

3.0 - 5.0 years

13 - 18 Lacs

Mumbai, Pune

Work from Office

Locations: Mumbai - Hiranandani; Pune - Business Bay. Posted 3 days ago. Application end date: June 30, 2025 (30 days left to apply). Job requisition ID: R_307045.

Company: Marsh

Description: We are seeking a talented individual to join our Benefit Analytics team at Marsh. This role will be based in Mumbai. This is a hybrid role that requires working at least three days a week in the office.

Principal Engineer - Data Science

We will count on you to:
- Design and implement data analytics products that utilize web-based technologies to solve complex business problems and drive strategic outcomes.
- Utilize strong conceptual skills to explore the "Art of the Possible" in analytics, integrating data, market trends, and cutting-edge technologies to inform business strategies.
- Manage and manipulate large datasets from diverse sources, ensuring data quality through cleaning, consolidation, and transformation into meaningful insights.
- Conduct exploratory data analysis (EDA) to identify patterns and trends, reporting key metrics and synthesizing disparate datasets for comprehensive insights.
- Perform rigorous quality assurance (QA) on datasets, ensuring accuracy, logical consistency, and alignment with analytical dashboards.
- Automate data capture processes from various sources, streamlining data cleaning and insight-generation workflows.
- Apply knowledge of insurance claims, policies, terminology, health risks, and wellbeing to enhance analytical models and insights.
- Collaborate with cross-functional teams to develop and deploy machine learning models and predictive analytics solutions.
- Utilize SQL for database management and data manipulation, with a focus on optimizing queries and data-retrieval processes.
- Develop ETL automation pipelines using tools such as Python, GenAI, and ChatGPT APIs, ensuring efficient and optimized code.
- Communicate complex data-driven solutions clearly and effectively, translating technical findings into actionable business recommendations.
- Knowledge of LLM/RAG/Power BI/Tableau is preferred.

What you need to have:
- Educational background: a Bachelor's or Master's degree in Computer Science, Information Technology, Mathematics, Statistics, or a related field is essential. A strong academic foundation will support your analytical and technical skills.
- Experience: 3-5 years of progressive experience in a Data Science or Data Analytics role, demonstrating a solid track record of delivering impactful data-driven insights and solutions.
- Programming skills: advanced proficiency in Python is required, with hands-on experience in data engineering and ETL processes. Familiarity with exploratory data analysis (EDA) techniques is essential.
- API knowledge: intermediate experience with ChatGPT APIs or similar technologies is a plus, showcasing your ability to integrate AI solutions into data workflows.
- Business intelligence tools: a good understanding of BI tools such as Qlik Sense, Power BI, or Tableau is necessary for effective data visualization and reporting.
- Data extraction expertise: proven ability to extract and manipulate data from diverse sources, including web platforms, PDFs, Excel files, and various databases. A broad understanding of analytics methodologies is crucial for transforming raw data into actionable insights.
- Analytical mindset: strong analytical and problem-solving skills, with the ability to interpret complex data sets and communicate insights effectively to stakeholders.
- Adaptability to new technologies: a keen interest in AI and emerging technologies, with a willingness to learn and adapt to new tools and methodologies in the rapidly evolving data landscape.

What makes you stand out:
- Degree or certification in data management, statistics, analytics, or BI tools (Qlik Sense and Tableau) preferred.
- Experience in the healthcare sector, working with multinational clients.

Why join our team:
- We help you be your best through professional development opportunities, interesting work, and supportive leaders.
- We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients, and communities.
- Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work, and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy, and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer, and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow us on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive, and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections, and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.

Posted 3 months ago

Apply

2.0 - 6.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

- Design, develop, and execute automated test scripts for ETL pipelines
- Automate data validation processes between source and target systems
- Build reusable test automation frameworks for ETL workflows
- Create test data and write complex SQL queries

Required Candidate Profile
- Identify bottlenecks and data quality issues across the data pipeline
- Collaborate with developers, analysts, and the QA team to define automation coverage
- Maintain test suites and update scripts
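One common shape for the "reusable test automation framework" these bullets describe is a data-driven pytest suite in which each ETL validation rule is a declarative entry. A minimal sketch, with hypothetical check definitions and sqlite3 standing in for the real source/target connections:

```python
# Minimal sketch of a data-driven ETL test framework: each check is a pair of
# SQL statements whose results must match. Checks and tables are hypothetical;
# sqlite3 keeps the sketch self-contained.
import sqlite3
import pytest

CHECKS = [
    ("row_count", "SELECT COUNT(*) FROM src_orders",
                  "SELECT COUNT(*) FROM dwh_orders"),
    ("amount_total", "SELECT ROUND(SUM(amount), 2) FROM src_orders",
                     "SELECT ROUND(SUM(amount), 2) FROM dwh_orders"),
]

@pytest.fixture(scope="module")
def conn():
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE src_orders (order_id INT, amount REAL);
        CREATE TABLE dwh_orders (order_id INT, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.5), (2, 20.0);
        INSERT INTO dwh_orders VALUES (1, 10.5), (2, 20.0);
    """)
    yield c
    c.close()

@pytest.mark.parametrize("name,src_sql,tgt_sql", CHECKS,
                         ids=[c[0] for c in CHECKS])
def test_etl_check(conn, name, src_sql, tgt_sql):
    # Run the paired queries and require identical scalar results.
    src = conn.execute(src_sql).fetchone()[0]
    tgt = conn.execute(tgt_sql).fetchone()[0]
    assert src == tgt, f"{name}: source={src} target={tgt}"
```

New validation rules are then added as data (rows in CHECKS) rather than as new test code, which is what makes the framework reusable.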

Posted 3 months ago

Apply

2.0 - 6.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

Greetings from KVC Consultants Ltd. Hiring for leading IT MNCs.

Job Description - Technical and Professional Requirements:
- Experienced data-testing profile with SQL, ETL, data-layer validations, microservices (API), and Python/Java automation.
- Good to have: Master Data Management (MDM); cloud (Azure/AWS/GCP) data testing.
- Good communication and stakeholder management.
- Test automation (GUI / service-oriented architecture / ETL) utilizing tools such as Selenium, Appium, Mocha, TestNG, CA DevTest, REST Assured, IDQ, and DVO, plus Jenkins, TeamCity, and GitLab.

Preferred Skills:
- Technology -> Automated Testing -> Automated Testing (all) -> Cucumber
- Technology -> Microservices -> Microservices API Management
- Technology -> Data Services Testing -> Data Warehouse Testing -> ETL tools
- Foundational -> Configuration Management -> Build Management -> Jenkins
- Technology -> OpenSystem -> Python
- Technology -> Mobile Testing -> Mobile Functional Test Automation (iOS/Android) -> Appium
- ETL testing + Python programming

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc

Thanks & Regards,
HR Team, KVC Consultants
No placement charges.
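For the microservices (API) data-validation item above, a minimal Python sketch comparing a REST response against the backing database record. The endpoint, payload shape, and table are hypothetical; sqlite3 stands in for the real data layer.

```python
# Minimal sketch (hypothetical endpoint and schema): validate that a
# microservice's API response agrees with the record in the backing database.
import sqlite3
import requests

def fetch_customer_api(customer_id: int) -> dict:
    # Hypothetical service URL; assumed to return {"id": ..., "balance": ...}.
    resp = requests.get(f"https://api.example.com/customers/{customer_id}",
                        timeout=10)
    resp.raise_for_status()
    return resp.json()

def fetch_customer_db(conn: sqlite3.Connection, customer_id: int) -> dict:
    row = conn.execute(
        "SELECT id, balance FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()
    return {"id": row[0], "balance": row[1]}

def check_api_matches_db(conn: sqlite3.Connection, customer_id: int) -> None:
    api = fetch_customer_api(customer_id)
    db = fetch_customer_db(conn, customer_id)
    assert api == db, f"data-layer mismatch: api={api} db={db}"
```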

Posted 3 months ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job Track Description:

Responsibilities
- Create and support complex DB objects such as stored procedures, custom functions, and triggers for both data warehouse and application development.
- Performance-tune DB objects, ETL, and SSRS reports.
- Design ETL, analysis, and reporting solutions.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Create and support ETL processes using SSIS and SQL Server jobs.
- Convert legacy reports to modern technologies using the Microsoft BI platform.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Guide team members toward the best solutions.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering, or equivalent experience.
- 6+ years of experience in SQL Server.
- 5+ years of experience in the Microsoft BI stack and reporting solutions (mainly Power BI and SSRS), including development of reports (RDLs, T-SQL, complex stored procedures, subscriptions, MDX) and dimensions, facts, and cubes.
- Data modeling / data warehouse design experience.
- Experience building ETL packages using SSIS is desirable.
- Strong understanding of SDLC and version control.
- Excellent oral and written communication skills.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by completing the downloadable accommodation form and emailing it as an attachment to FTADAAA@conduent.com. Conduent's ADAAA Accommodation Policy is also available on request.

At Conduent we value the health and safety of our associates, their families, and our community. For US applicants: while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 3 months ago

Apply

4.0 - 7.0 years

8 - 15 Lacs

Pune

Work from Office

• Experience with ETL testing.
• Ability to create Databricks notebooks to automate manual tests.
• Ability to create and run test pipelines and interpret the results.
• Ability to test complex reports and write queries to check each metric.

Required Candidate Profile
• Experience in Azure Databricks and SQL queries; ability to analyse data in a data warehouse environment.
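A minimal sketch of the kind of notebook automation described here: a PySpark check that a reported metric matches an aggregate recomputed from its source table. Table and column names are hypothetical; on Databricks the `spark` session is already provided, so the getOrCreate line matters only when running standalone.

```python
# Minimal sketch (hypothetical tables): a notebook-style PySpark check that a
# reported metric agrees with an aggregate recomputed from the source table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided as `spark` on Databricks

source = spark.table("sales.orders")           # hypothetical source table
report = spark.table("reporting.daily_sales")  # hypothetical report table

expected = (source.filter(F.col("status") == "VALID")
                  .groupBy("order_date")
                  .agg(F.round(F.sum("amount"), 2).alias("total_expected")))

# Any row surviving this filter is a disagreement between report and source.
diff = (report.join(expected, "order_date", "full_outer")
              .filter(
                  (F.col("total_amount") != F.col("total_expected"))
                  | F.col("total_amount").isNull()
                  | F.col("total_expected").isNull()
              ))

assert diff.count() == 0, "report metric disagrees with source aggregate"
```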

Posted 3 months ago

Apply

4 - 9 years

10 - 20 Lacs

Pune, Chennai, Coimbatore

Hybrid

Qualification:
- Strong SQL and data transformation skills
- Experience in programming or scripting languages such as Python, C#, Java, JavaScript/TypeScript
- Understanding of ETL/ELT process fundamentals
- Experience designing, developing, and maintaining robust, scalable test automation frameworks such as Playwright, Selenium, or Cypress
- Experience testing data with tools such as Power BI, dbt, and Snowflake (nice to have)
- Experience with GitHub Actions or similar platforms for automating and managing test workflows

Responsibilities:
- Translate project requirements into effective and comprehensive test cases.
- Define clear testing objectives that align with overall project goals.
- Establish the testing scope, prioritizing critical features and functionalities.
- Document expected deliverables, such as detailed test plans, scripts, and reports.
- Use dbt to build tests that ensure the ETL process is working as intended.
- Automate the QA team's common manual testing by creating macros in dbt.
- Build and monitor automated system health checks.
- Collaborate with enterprise data engineers to investigate the root cause of issues and suggest resolutions.
- Orchestrate data-testing solutions using Airflow (see the sketch below this listing).
- Support the team in doing releases.
- Develop and maintain test automation frameworks, integrating them with CI/CD pipelines.
- Collaborate effectively with developers to implement testing strategies at lower levels, facilitating a "shift left" approach and promoting early defect detection.
- Take ownership of application quality from requirements gathering through development and testing, ensuring a high standard of product excellence.
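A minimal sketch of the Airflow orchestration item above: a DAG that builds the dbt models and then runs the dbt test suite against them. The project path and schedule are hypothetical, and dbt is invoked through BashOperator for simplicity.

```python
# Minimal sketch (hypothetical project path and schedule): an Airflow DAG that
# builds dbt models and then runs the dbt test suite against them.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics_project"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_build_and_test",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test",
    )
    dbt_run >> dbt_test  # tests only run after a successful build
```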

Posted 4 months ago

Apply

9 - 14 years

12 - 16 Lacs

Gurugram

Work from Office

Job Title: GN - SC&O - S&P - Spend Analytics - Senior Data Scientist
Management Level: 8 - Associate Manager
Location: Gurgaon
Must-have skills: Data Handling, Data Classification, AI/ML
Good-to-have skills: Data Mining, Python

Job Summary: As an Associate Manager in Spend Analytics and Senior Data Scientist, you will be responsible for leading the design, development, and implementation of AI/ML-powered procurement and analytics solutions. You will work closely with cross-functional teams to conceptualize and deploy platforms that identify cost-saving opportunities, enhance supplier management, and deliver business intelligence to enterprise clients.

Roles and Responsibilities:

Data Handling
- Data collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data audit: Lead assessment of data quality, identify gaps, and create summaries per database requirements.
- Data scrubbing: Assist with creating data-cleansing rules and incorporate data clarifications provided by data-source owners.
- Data profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control-total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques.
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary research: Conduct market research to create company and industry primers using online secondary data and information sources.

Professional and Technical Skills
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data-mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Additional Information:
- Work experience: 9-11 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team-management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.

About Our Company | Accenture

Qualification
- Experience: 9+ years
- Educational qualification: Bachelor's or Master's degree

Posted 4 months ago

Apply

3 - 4 years

10 - 14 Lacs

Bengaluru

Work from Office

We are seeking a Backend and Data Processing Engineer to join our team. In this role, you will be responsible for developing a CRM system on AWS, integrating data from each ERP system and connecting with the messaging hub system. You will work closely with front-end developers, data scientists, and other stakeholders to ensure that our systems are scalable, reliable, and performant. You will be responsible for creating a robust and secure AWS-based data lake by collecting and standardizing diverse data sources. Experience level: 3-4 years.

Key Responsibilities:
- Standardize data and store/manage it in an AWS-based data lake.
- Collect customer data such as reservations, memberships, and payments from ERP systems across various locations.
- Implement automation for data pipelines and data-consistency verification logic.
- Design marketing triggers based on customer journeys and implement automated scenario-execution logic.
- Integrate with internal messaging hub systems to automatically send personalized SMS, emails, and notifications.
- Design structures for collecting data on message-delivery results and user responses.

Required Skill Set:
- Proficiency in backend development using Python, Node.js, etc.
- Experience designing and developing RESTful APIs.
- Skilled in SQL-based data analysis and database design.
- Experience collecting and standardizing data from external systems such as ERP.
- Familiarity with ETL automation and scheduling.
- Experience integrating with messaging systems.

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Relevant backend and data-pipeline building experience is crucial.
- Prior experience in CRM systems development/integration is preferred.
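A minimal sketch of the data-lake landing step this listing describes: standardizing an ERP extract onto one schema and writing it to S3 with boto3. The bucket, key prefix, and field names are hypothetical.

```python
# Minimal sketch (hypothetical bucket and schema): standardize an ERP CSV
# extract and land it in an S3-based data lake partitioned by load date.
import csv
import io
from datetime import date

import boto3

def standardize(raw_rows):
    """Map heterogeneous ERP fields onto one standard schema."""
    for row in raw_rows:
        yield {
            "customer_id": row.get("cust_no") or row.get("customer_id"),
            "amount": f'{float(row.get("amt", 0)):.2f}',
            "event_type": (row.get("type") or "unknown").lower(),
        }

def land_to_s3(raw_rows, bucket="my-crm-data-lake"):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["customer_id", "amount", "event_type"])
    writer.writeheader()
    writer.writerows(standardize(raw_rows))

    key = f"standardized/erp_events/dt={date.today():%Y-%m-%d}/events.csv"
    boto3.client("s3").put_object(Bucket=bucket, Key=key,
                                  Body=buf.getvalue().encode("utf-8"))
    return key

# Example: rows as parsed from one location's ERP export.
print(land_to_s3([{"cust_no": "C-101", "amt": "120.5", "type": "PAYMENT"}]))
```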

Posted 4 months ago

Apply

0 - 1 years

4 - 4 Lacs

Pune

Work from Office

• Candidate will be involved in both automated and manual ETL testing across multiple scrum teams.
• Hands-on experience writing SQL queries.
• Good knowledge of writing Python scripts.
• Writing test scenarios and test scripts.

Required Candidate Profile
• Good understanding of ETL frameworks.
• Ability to write SQL queries.
• Ability to write manual test cases from business requirements and technical specifications.
• Strong oral and written English communication skills.

Posted 4 months ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Noida, Gurugram, Delhi/NCR

Work from Office

Job description: Hiring for ETL testing with an experience range of 4 years and above.
Mandatory skills: ETL testing, Selenium.
Education: B.Tech, BCA, B.Sc.

Posted Date not available

Apply

8.0 - 12.0 years

30 - 45 Lacs

Coimbatore

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 9+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)
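Since the nice-to-have list names Great Expectations, a minimal sketch of declarative data-quality checks in that style, using the library's legacy pandas-dataset API (available in pre-1.0 versions) on hypothetical data:

```python
# Minimal sketch (hypothetical data) of declarative data-quality checks using
# Great Expectations' legacy pandas API (ge.from_pandas; pre-1.0 versions).
import great_expectations as ge
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [10.5, 20.0, 7.25],
    "status": ["VALID", "VALID", "CANCELLED"],
})

batch = ge.from_pandas(df)

checks = [
    batch.expect_column_values_to_not_be_null("order_id"),
    batch.expect_column_values_to_be_unique("order_id"),
    batch.expect_column_values_to_be_between("amount", min_value=0),
    batch.expect_column_values_to_be_in_set("status", ["VALID", "CANCELLED"]),
]

failed = [c for c in checks if not c.success]
assert not failed, f"{len(failed)} data-quality expectation(s) failed"
print("All expectations passed.")
```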

Posted Date not available

Apply

8.0 - 12.0 years

30 - 45 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 9+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply

8.0 - 12.0 years

30 - 45 Lacs

Chennai

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 9+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply

3.0 - 8.0 years

13 - 23 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 3+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply

3.0 - 8.0 years

13 - 23 Lacs

Chennai

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 3+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply

3.0 - 8.0 years

13 - 23 Lacs

Coimbatore

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 3+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply

5.0 - 8.0 years

20 - 33 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a highly skilled and detail-oriented Data Quality Engineer (Automation) to join our data engineering team. The ideal candidate will have a strong background in ETL automation, data warehouse testing, and cloud data services, along with hands-on experience in test automation frameworks and CI/CD practices.

Responsibilities
- Design, develop, and execute automated test scripts for validating ETL workflows and data pipelines
- Perform end-to-end data validation, reconciliation, and DWH/DB testing across large datasets
- Collaborate with data engineers, analysts, and stakeholders to define test strategies and ensure data quality standards are met
- Develop reusable and scalable test automation frameworks using Python and SQL
- Integrate automated tests into CI/CD pipelines to ensure continuous testing and delivery
- Validate data transformation logic and completeness across different cloud platforms (Azure, AWS, GCP)
- Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and story grooming

Requirements
- Experience: 5.5+ years
- Strong hands-on experience in ETL automation and data warehouse/data lake testing
- Proficient in SQL for data validation and test automation
- Good knowledge of Python for scripting and framework development
- Experience with one or more test automation frameworks
- Exposure to cloud data services across major platforms: Azure (ADF, ADB), AWS (Glue, Lambda, Redshift), GCP (BigQuery)
- Understanding of CI/CD tools and practices (e.g., Jenkins, Git, Azure DevOps)
- Familiarity with Agile/Scrum methodologies

Nice to Have
- Experience working with data observability or data quality tools (e.g., Great Expectations, Deequ)
- Basic understanding of data governance and metadata management concepts
- Certification in cloud platforms (Azure/AWS/GCP)

Posted Date not available

Apply