
25 ETL Automation Jobs

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

6.0 - 11.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Track Description:
- Create and support complex DB objects such as stored procedures, custom functions, and triggers for both data warehouse and application development.
- Performance tuning of DB objects, ETL, and SSRS reports.
- Design ETL, analysis, and reporting solutions.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Create and support ETL processes using SSIS and SQL Server jobs.
- Convert legacy reports to modern technologies using the Microsoft BI platform.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Guide team members toward the best solutions.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering or equivalent experience.
- 6+ years of experience in SQL Server.
- 5+ years of experience in the Microsoft BI stack and reporting solutions (mainly Power BI and SSRS), including development of reports (RDLs, T-SQL, complex stored procedures, subscriptions, MDX) and dimensions, facts, and cubes.
- Data modeling / data warehouse design experience.
- Experience building ETL packages using SSIS is desirable.
- Strong understanding of SDLC and version control.
- Excellent oral and written communication skills.

Posted 6 days ago

Apply

3.0 - 10.0 years

1 - 2 Lacs

Pune, Maharashtra, India

On-site

Source: Foundit

Our client is an EU subsidiary of a global financial bank working across multiple markets and asset classes. The DWH / ETL Tester will work closely with the development team to design and build interfaces and integrate data from a variety of internal and external data sources into the new Enterprise Data Warehouse environment. The ETL Tester will be primarily responsible for testing the Enterprise Data Warehouse using automation, within industry-recognized ETL standards, architecture, and best practices.

Responsibilities
- Perform intake of new ETL projects and initiatives; make high-level assessments of the roadmap in collaboration with leadership.
- Design the test strategy and test plan to address the needs of cloud-based ETL pipelines.
- Contribute to and manage testing deliverables.
- Ensure the implementation of test standards and best practices for the agile model and contribute to their development.
- Engage with internal stakeholders across the organization to seek alignment and collaboration; deal with external stakeholders and vendors.
- Identify risks and issues and present associated mitigating actions, taking into account the criticality of the underlying business domain.
- Contribute to continuous improvement of standard testing processes.

Skills
- Expert-level knowledge of data warehouse and RDBMS concepts.
- Expertise in modern cloud-based data warehouse solutions (ADF, Snowflake, GCP, etc.).
- Hands-on expertise writing complex SQL with multiple joins and complex functions to test transformations and ETL requirements (see the illustrative sketch below).
- Knowledge and experience creating test automation for database and ETL testing regression suites.
- Automation using Selenium with Python (or JavaScript), Python scripts, and shell scripts.
- Knowledge of framework design and REST API testing of databases using Python.
- Experience with the Atlassian tool set and Azure DevOps.
- Experience with code and version management: Git, Bitbucket, Azure Repos, etc.

Qualifications
- A bachelor's degree or equivalent experience in computer science or a similar field.
- Experience crafting test strategies and supervising ETL/DWH test activities in multi-platform, sophisticated cloud-based environments.
- Strong analytical mindset, with the ability to extract relevant information from documentation, system data, clients, and colleagues and analyze the captured information.
- ISTQB Foundation Certificate in Software Testing.
- Optional/preferred: experience in the financial industry, knowledge of regulatory reporting and the terms/terminology used.

Important to Have
- Proficiency in English (read/write/speak).
- Able to demonstrate the ability to learn new technologies and easily adapt to new circumstances, technologies, and procedures.
- Stress-resistant and constructive whatever the context.
- Able to align with existing standards and act with attention to detail.
- A true standout colleague who demonstrates good interpersonal skills and can summarize complex technical situations in simple terms.
- Solution- and customer-focused. Good communication skills, a positive attitude, and a competitive but team-oriented focus are key to success in this challenging environment.

Nice to Have
- Experience in the financial industry, knowledge of regulatory reporting and the terms/terminology used.
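For illustration only (not part of the posting): a minimal sketch of the kind of source-to-target reconciliation SQL such a role involves, run here against an in-memory SQLite database with made-up source and target tables.

```python
import sqlite3

# Hypothetical source and target tables, built in-memory so the sketch is runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customers (id INTEGER, name TEXT, balance REAL);
    CREATE TABLE dwh_customers (id INTEGER, name TEXT, balance REAL);
    INSERT INTO src_customers VALUES (1, 'Asha', 120.0), (2, 'Ravi', 75.5), (3, 'Meera', 0.0);
    INSERT INTO dwh_customers VALUES (1, 'Asha', 120.0), (2, 'Ravi', 80.0);
""")

# Row-count reconciliation between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM dwh_customers").fetchone()[0]
print(f"source rows={src_count}, target rows={tgt_count}")

# Row-level differences: records present in the source but missing or altered in the target.
missing_or_changed = conn.execute("""
    SELECT id, name, balance FROM src_customers
    EXCEPT
    SELECT id, name, balance FROM dwh_customers
""").fetchall()
print("rows not matched in target:", missing_or_changed)
```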

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 8 Lacs

Chandigarh

Work from Office

Source: Naukri

Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines in Google Cloud Platform (GCP). This role collaborates closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities
- Data Ingestion & Landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
- ETL Pipeline Development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
- Data Modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic-layer definitions to support downstream reporting.
- Load & Orchestration: Load curated datasets into BigQuery across zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting). An illustrative load sketch appears below.
- Performance & Quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
- Collaboration & Documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience building ETL pipelines in GCP.
- Proficiency with Cloud Data Fusion, including Wrangler transformations.
- Strong command of SQL, including performance tuning in BigQuery.
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats.
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas.
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization.
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries).
- Excellent problem-solving skills and attention to detail.

Preferred (Good to Have)
- Exposure to Power BI data modeling and DAX.
- Experience with other GCP services (Dataflow, Dataproc).
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform).
- Knowledge of Python for custom transformations or orchestration scripts.
- Understanding of data governance best practices and metadata management.
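As a hedged illustration of the landing-to-BigQuery load step above (not part of the posting): a minimal sketch using the google-cloud-bigquery client to load Parquet files from Cloud Storage into a raw-zone table. The project, bucket, and table names are hypothetical, and GCP credentials are assumed to be configured.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical identifiers; replace with real project/bucket/dataset names.
PROJECT_ID = "example-project"
SOURCE_URI = "gs://example-landing-bucket/sales/2024-06-01/*.parquet"
TARGET_TABLE = "example-project.raw_zone.sales"

client = bigquery.Client(project=PROJECT_ID)

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full reload of the raw table
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(SOURCE_URI, TARGET_TABLE, job_config=job_config)
load_job.result()

table = client.get_table(TARGET_TABLE)
print(f"Loaded {table.num_rows} rows into {TARGET_TABLE}")
```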

Posted 1 week ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Pune, Chennai

Work from Office

Source: Naukri

Role: TSO
Experience: Minimum 2 years
Must have: Experience in Data Migration Testing (ETL Testing), manual and automation testing, Agile Scrum methodology, and the financial domain.
Salary: Up to 5.25 LPA
Location: Chennai / Pune
Regards, Vicky - 7200396456

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

Data Scientist (6-8 Years Experience)

Company Overview
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology, and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 700,000+ people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. At Accenture, we offer a dynamic and challenging environment for analytical and creative minds. We value a diverse workforce committed to achieving client goals and delivering results. We're built on a foundation of exceptional talent and are on the lookout for driven, energetic professionals to enhance our team.

Key Responsibilities

Data Handling
- Data Collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data Audit: Lead assessment of data quality, identify gaps, and create summaries as per database requirements.
- Data Scrubbing: Assist with creating data cleansing rules and incorporate data clarifications provided by data source owners.
- Data Profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend Classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report Generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic Refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques (an illustrative sketch appears at the end of this listing).
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary Research: Conduct market research to create company and industry primers using online secondary data or information sources.

Technical Requirements
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.

Collaboration and Communication
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Other Important Details
- Work Location: Gurugram, India
- Expected Start Dates: July 2024

Qualifications - Who Should Apply?
- Work Experience: 6-8 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic Qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.

Job Requirements:
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.
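Purely for illustration (not Accenture's actual tooling or method): a minimal scikit-learn sketch of the supervised spend auto-classification idea mentioned above; the taxonomy categories and training lines are made up.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set: invoice line descriptions and their taxonomy labels.
descriptions = [
    "dell laptop purchase", "hp printer toner cartridge",
    "office cleaning services", "janitorial supplies monthly contract",
    "air travel mumbai to delhi", "hotel accommodation conference",
]
labels = ["IT Hardware", "IT Hardware", "Facilities", "Facilities", "Travel", "Travel"]

# TF-IDF features + logistic regression: a simple supervised auto-classification baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(descriptions, labels)

# Classify unseen spend lines into the custom taxonomy.
new_lines = ["lenovo thinkpad docking station", "flight tickets bengaluru chennai"]
print(dict(zip(new_lines, model.predict(new_lines))))
```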

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Chennai, Coimbatore

Hybrid

Source: Naukri

Greetings from Cognizant! We are hiring for a permanent position with Cognizant.
Experience: 3 - 6 years
Mandatory skills: ETL Testing, MongoDB, Python, SQL, Azure Databricks
Work Location: Chennai, Coimbatore
Interview Mode: Virtual
Interview Dates: Weekday & Weekend

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Experience: 5-11 years
Location: Bangalore / Chennai / Hyderabad / Pune (F2F on 7th June)
Notice Period: Immediate

Key Responsibilities:
- Design and execute test plans, test cases, and test scripts for ETL processes.
- Validate data quality, integrity, and accuracy in source and target systems (see the illustrative sketch below).
- Identify and report defects, and collaborate with ETL developers to resolve issues.
- Perform data profiling, data validation, and data reconciliation.
- Develop and maintain test automation frameworks and scripts.
- Lead and mentor junior testers, and provide guidance on testing best practices.
- Collaborate with stakeholders to understand testing requirements and ensure compliance with testing standards.
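For illustration only (not part of the posting): a minimal pandas sketch of the kind of data-quality validation described above; the table, expected row count, and thresholds are hypothetical.

```python
import pandas as pd

# Hypothetical target-table extract; in practice this would come from the warehouse.
target = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "amount": [250.0, 99.9, 99.9, None],
})

expected_rows = 4  # assumed source row count for this sketch
checks = {
    "row_count_matches_source": len(target) == expected_rows,
    "customer_id_is_unique": not target["customer_id"].duplicated().any(),
    "email_null_rate_below_10pct": target["email"].isna().mean() < 0.10,
    "amount_non_negative": (target["amount"].dropna() >= 0).all(),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```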

Posted 2 weeks ago

Apply

3.0 - 5.0 years

13 - 18 Lacs

Mumbai, Pune

Work from Office

Source: Naukri

Locations: Mumbai - Hiranandani; Pune - Business Bay
Posted: 3 days ago
Application end date: June 30, 2025 (30 days left to apply)
Job requisition ID: R_307045
Company: Marsh

Description: We are seeking a talented individual to join our Benefit Analytics Team at Marsh. This role will be based in Mumbai. This is a hybrid role that requires working at least three days a week in the office.

Principal Engineer - Data Science

We will count on you to:
- Design and implement data analytics products that use web-based technologies to solve complex business problems and drive strategic outcomes.
- Use strong conceptual skills to explore the "Art of the Possible" in analytics, integrating data, market trends, and cutting-edge technologies to inform business strategies.
- Manage and manipulate large datasets from diverse sources, ensuring data quality through cleaning, consolidation, and transformation into meaningful insights.
- Conduct exploratory data analysis (EDA) to identify patterns and trends, reporting key metrics and synthesizing disparate datasets for comprehensive insights.
- Perform rigorous quality assurance (QA) on datasets, ensuring accuracy, logical consistency, and alignment with analytical dashboards.
- Automate data capture processes from various sources, streamlining data cleaning and insight-generation workflows.
- Apply knowledge of insurance claims, policies, terminologies, health risks, and wellbeing to enhance analytical models and insights.
- Collaborate with cross-functional teams to develop and deploy machine learning models and predictive analytics solutions.
- Use SQL for database management and data manipulation, with a focus on optimizing queries and data retrieval processes.
- Develop ETL automation pipelines using tools such as Python, GenAI, and ChatGPT APIs, ensuring efficient and optimized code.
- Communicate complex data-driven solutions clearly and effectively, translating technical findings into actionable business recommendations.
- Knowledge of LLM/RAG/Power BI/Tableau is preferred.

What you need to have:
- Educational background: a Bachelor's or Master's degree in Computer Science, Information Technology, Mathematics, Statistics, or a related field is essential. A strong academic foundation will support your analytical and technical skills.
- Experience: 3-5 years of progressive experience in a Data Science or Data Analytics role, with a solid track record of delivering impactful data-driven insights and solutions.
- Programming skills: advanced proficiency in Python is required, with hands-on experience in data engineering and ETL processes. Familiarity with exploratory data analysis (EDA) techniques is essential.
- API knowledge: intermediate experience with ChatGPT APIs or similar technologies is a plus, showcasing your ability to integrate AI solutions into data workflows.
- Business intelligence tools: a good understanding of BI tools such as Qlik Sense, Power BI, or Tableau is necessary for effective data visualization and reporting.
- Data extraction expertise: proven ability to extract and manipulate data from diverse sources, including web platforms, PDFs, Excel files, and various databases. A broad understanding of analytics methodologies is crucial for transforming raw data into actionable insights.
- Analytical mindset: strong analytical and problem-solving skills, with the ability to interpret complex data sets and communicate insights effectively to stakeholders.
- Adaptability to new technologies: a keen interest in AI and emerging technologies, with a willingness to learn and adapt to new tools and methodologies in the rapidly evolving data landscape.

What makes you stand out:
- A degree or certification in data management, statistics, analytics, or BI tools (Qlik Sense & Tableau) would be preferred.
- Experience in the healthcare sector, working with multinational clients.

Why join our team:
- We help you be your best through professional development opportunities, interesting work, and supportive leaders.
- We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients, and communities.
- Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work, and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

Source: Naukri

- Design, develop, and execute automated test scripts for ETL pipelines.
- Automate data validation processes between source and target systems (a hedged pytest sketch appears below).
- Build reusable test automation frameworks for ETL workflows.
- Create test data and write complex SQL queries.

Required Candidate Profile
- Identify bottlenecks and data quality issues across the data pipeline.
- Collaborate with developers, analysts, and the QA team to define automation coverage.
- Maintain test suites and update scripts.
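A hedged sketch of the kind of reusable automated check described above; the tables and the in-memory SQLite connection are stand-ins for real source and target systems.

```python
# test_etl_counts.py -- run with: pytest test_etl_counts.py
import sqlite3
import pytest

TABLES = ["customers", "orders"]  # hypothetical tables covered by the ETL

@pytest.fixture(scope="module")
def conn():
    # In-memory stand-in for real source/target connections.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src_customers (id INTEGER); CREATE TABLE tgt_customers (id INTEGER);
        CREATE TABLE src_orders (id INTEGER);    CREATE TABLE tgt_orders (id INTEGER);
        INSERT INTO src_customers VALUES (1), (2); INSERT INTO tgt_customers VALUES (1), (2);
        INSERT INTO src_orders VALUES (1);         INSERT INTO tgt_orders VALUES (1);
    """)
    yield con
    con.close()

@pytest.mark.parametrize("table", TABLES)
def test_row_counts_match(conn, table):
    # Table names come from the fixed list above, so the f-string interpolation is safe here.
    src = conn.execute(f"SELECT COUNT(*) FROM src_{table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM tgt_{table}").fetchone()[0]
    assert src == tgt, f"{table}: source={src} target={tgt}"
```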

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

Source: Naukri

Greetings from KVC CONSULTANTS LTD. Hiring for leading IT MNCs.

Job Description - Technical and Professional Requirements:
> Experienced data testing profile with: SQL, ETL, data layer validations, microservices (API), and Python / Java automation.
> Good to have: Master Data Management (MDM) and cloud (Azure/AWS/GCP) data testing.
> Good communication and stakeholder management.
> Test automation (GUI / Service-Oriented Architecture / ETL) using tools such as Selenium, Appium, Mocha, TestNG, CA DevTest, Rest Assured, IDQ, DVO, Jenkins, TeamCity, and GitLab.

Preferred Skills:
- Technology -> Automated Testing -> Automated Testing - ALL -> Cucumber
- Technology -> Microservices -> Microservices API Management
- Technology -> Data Services Testing -> Data Warehouse Testing -> ETL tool
- Foundational -> Configuration Management -> Build Management -> Jenkins
- Technology -> OpenSystem -> Python - OpenSystem -> Python
- Technology -> Mobile Testing -> Mobile Functional Test Automation (iOS, Android) -> Appium
- ETL Testing + Python Programming

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc

Thanks & Regards,
HR Team, KVC CONSULTANTS
NO PLACEMENT CHARGES

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Track Description:
- Create and support complex DB objects such as stored procedures, custom functions, and triggers for both data warehouse and application development.
- Performance tuning of DB objects, ETL, and SSRS reports.
- Design ETL, analysis, and reporting solutions.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Create and support ETL processes using SSIS and SQL Server jobs.
- Convert legacy reports to modern technologies using the Microsoft BI platform.
- Develop and maintain a thorough understanding of business needs from both technical and business perspectives.
- Guide team members toward the best solutions.
- Effectively prioritize and execute tasks in a high-pressure environment.

Qualifications / Experience
- Bachelor's/Master's degree in Computer Science / Computer Engineering or equivalent experience.
- 6+ years of experience in SQL Server.
- 5+ years of experience in the Microsoft BI stack and reporting solutions (mainly Power BI and SSRS), including development of reports (RDLs, T-SQL, complex stored procedures, subscriptions, MDX) and dimensions, facts, and cubes.
- Data modeling / data warehouse design experience.
- Experience building ETL packages using SSIS is desirable.
- Strong understanding of SDLC and version control.
- Excellent oral and written communication skills.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law.

People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by completing the downloadable accommodation request form and emailing it as an attachment to FTADAAA@conduent.com. Conduent's ADAAA Accommodation Policy is also available on request.

At Conduent we value the health and safety of our associates, their families and our community. For US applicants, while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

8 - 15 Lacs

Pune

Work from Office

Source: Naukri

- Experience with ETL testing.
- Ability to create Databricks notebooks to automate manual tests (an illustrative sketch appears below).
- Ability to create and run test pipelines and interpret the results.
- Ability to test complex reports and write queries to check each metric.

Required Candidate Profile
- Experience in Azure Databricks and SQL queries; ability to analyse data in a data warehouse environment.
- Ability to test complex reports and write queries to check each metric.
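As a hedged illustration of automating such a manual check in a notebook (table contents are made up; on Databricks the `spark` session already exists, and the local builder below only keeps the sketch self-contained):

```python
from pyspark.sql import SparkSession

# On Databricks, `spark` is pre-created; a local session keeps this sketch runnable elsewhere.
spark = SparkSession.builder.appName("etl-report-check").getOrCreate()

# Hypothetical stand-ins for a report's source table and the published report extract.
source = spark.createDataFrame([(1, 100.0), (2, 250.0), (3, 75.0)], ["order_id", "revenue"])
report = spark.createDataFrame([(1, 100.0), (2, 250.0)], ["order_id", "revenue"])

# Rows present in the source but missing from the report extract.
missing = source.exceptAll(report)
print(f"rows missing from the report: {missing.count()}")
missing.show()
```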

Posted 2 weeks ago

Apply

4 - 9 years

10 - 20 Lacs

Pune, Chennai, Coimbatore

Hybrid

Source: Naukri

Qualification:
- Strong SQL and data transformation skills.
- Experience in programming or scripting languages such as Python, C#, Java, JavaScript/TypeScript.
- Understanding of ETL/ELT process fundamentals.
- Experience designing, developing, and maintaining robust and scalable test automation frameworks such as Playwright, Selenium, or Cypress.
- Experience testing data with tools such as Power BI, dbt, and Snowflake (nice to have).
- Experience with GitHub Actions or similar platforms for automating and managing test workflows.

Responsibilities:
- Translate project requirements into effective and comprehensive test cases.
- Define clear testing objectives that align with overall project goals.
- Establish the testing scope, prioritizing critical features and functionalities.
- Document expected deliverables, such as detailed test plans, scripts, and reports.
- Use dbt to build tests that ensure the ETL process is working as intended.
- Automate the common manual testing done by QA by creating macros in dbt.
- Build and monitor automated system health checks.
- Collaborate with Enterprise Data Engineers to investigate the root cause of issues and suggest resolutions.
- Orchestrate data testing solutions using Airflow (a hedged sketch appears below).
- Be able to support the team in doing releases.
- Develop and maintain test automation frameworks, integrating them with CI/CD pipelines.
- Collaborate effectively with developers to implement testing strategies at lower levels, facilitating a "shift left" approach and promoting early defect detection.
- Take ownership of application quality from requirements gathering through development and testing, ensuring a high standard of product excellence.
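A minimal sketch, assuming a dbt project at a hypothetical path, of orchestrating the dbt run/test cycle with Airflow as mentioned above; it is not this team's actual pipeline.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical dbt project location on the Airflow workers.
DBT_PROJECT_DIR = "/opt/airflow/dbt/warehouse"

with DAG(
    dag_id="dbt_data_quality_checks",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # Build the models, then run dbt's built-in and custom tests against them.
    dbt_run = BashOperator(task_id="dbt_run", bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR}")
    dbt_test = BashOperator(task_id="dbt_test", bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR}")
    dbt_run >> dbt_test
```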

Posted 1 month ago

Apply

9 - 14 years

12 - 16 Lacs

Gurugram

Work from Office

Source: Naukri

Job Title: GN - SC&O - S&P - Spend Analytics - Senior Data Scientist
Management Level: 8 - Associate Manager
Location: Gurgaon
Must-have skills: Data Handling, Data Classification, AI/ML
Good-to-have skills: Data Mining, Python

Job Summary:
As an Associate Manager in Spend Analytics and Senior Data Scientist, you will be responsible for leading the design, development, and implementation of AI/ML-powered procurement and analytics solutions. You will work closely with cross-functional teams to conceptualize and deploy platforms that identify cost-saving opportunities, enhance supplier management, and deliver business intelligence to enterprise clients.

Roles and Responsibilities:

Data Handling
- Data Collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data Audit: Lead assessment of data quality, identify gaps, and create summaries as per database requirements.
- Data Scrubbing: Assist with creating data cleansing rules and incorporate data clarifications provided by data source owners.
- Data Profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend Classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report Generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic Refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques.
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary Research: Conduct market research to create company and industry primers using online secondary data or information sources.

Professional and Technical Skills
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Additional Information:
- Work Experience: 9-11 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic Qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.

About Our Company | Accenture

Qualification
- Experience: 9+ years
- Educational Qualification: Bachelor's or Master's degree

Posted 1 month ago

Apply

3 - 4 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a Backend and Data Processing Engineer to join our team. In this role, you will be responsible for developing a CRM system on AWS that integrates data from each ERP system and connects with the messaging hub system. You will work closely with front-end developers, data scientists, and other stakeholders to ensure that our systems are scalable, reliable, and performant. You will be responsible for creating a robust and secure AWS-based data lake by collecting and standardizing diverse data sources. (An illustrative standardization sketch appears below.)

Experience Level: 3-4 years

Key Responsibilities:
- Standardize data and store/manage it in an AWS-based data lake.
- Collect customer data such as reservations, memberships, and payments from ERP systems across various locations.
- Implement automation for data pipelines and data-consistency verification logic.
- Design marketing triggers based on customer journeys and implement automated scenario execution logic.
- Integrate with internal messaging hub systems to automatically send personalized SMS, emails, and notifications.
- Design structures for collecting data on message delivery results and user responses.

Required Skill Set:
- Proficiency in backend development using Python, Node.js, etc.
- Experience designing and developing RESTful APIs.
- Skilled in SQL-based data analysis and database design.
- Experience in collecting and standardizing data from external systems such as ERP.
- Familiarity with ETL automation and scheduling.
- Experience in integrating with messaging systems.

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Relevant backend and data pipeline building experience is crucial.
- Prior experience in CRM systems development / integration is preferred for this position.
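For illustration (the bucket, key, and column names are hypothetical): a minimal sketch of standardizing an ERP extract to Parquet and landing it in an S3-based data lake; pandas' Parquet writer assumes pyarrow is installed, and the upload assumes AWS credentials are configured.

```python
import boto3
import pandas as pd

# Hypothetical data-lake location.
BUCKET = "example-crm-datalake"
KEY = "standardized/reservations/branch_a/2024-06-01.parquet"

# Hypothetical ERP extract; in practice this would be read from a CSV or an API.
df = pd.DataFrame({
    "Reservation Id": [9001, 9002],
    "Guest Name": ["Asha", "Ravi"],
    "Reservation Date": ["2024-06-01", "2024-06-02"],
})

# Standardize: consistent column names, parsed dates, and a source-system identifier.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df["reservation_date"] = pd.to_datetime(df["reservation_date"], errors="coerce")
df["source_system"] = "branch_a_erp"

# Write Parquet locally (requires pyarrow), then land it in the S3 data lake.
local_path = "/tmp/reservations_branch_a.parquet"
df.to_parquet(local_path, index=False)
boto3.client("s3").upload_file(local_path, BUCKET, KEY)
print(f"uploaded s3://{BUCKET}/{KEY}")
```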

Posted 1 month ago

Apply

- 1 years

4 - 4 Lacs

Pune

Work from Office

Source: Naukri

- Candidate will be involved in both automated and manual ETL testing across multiple scrum teams.
- Hands-on experience writing SQL queries.
- Good knowledge of writing Python scripts.
- Writing test scenarios and test scripts.

Required Candidate Profile
- Good understanding of ETL frameworks.
- Able to write SQL queries.
- Ability to write manual test cases from business requirements and technical specifications.
- Strong oral and written English communication skills.

Posted 1 month ago

Apply

5 - 7 years

12 - 18 Lacs

Mumbai

Work from Office

Source: Naukri

Workday Integration Analyst - ETL, APIs, and data warehousing. The role involves managing HR system integrations, troubleshooting issues, and ensuring data accuracy. Skills: EIB, Workday Studio, XML, Java, and Web Services (SOAP, REST, WSDL). Share your CV at HR@Akriyagroup.com.

Posted 2 months ago

Apply

6 - 11 years

10 - 18 Lacs

Bengaluru

Hybrid

Source: Naukri

We are seeking a talented and detail-oriented ETL Automation Tester to join our dynamic testing team. The ideal candidate will have a strong background in automation testing, specifically for ETL processes, and proficiency in Selenium, Azure Databricks, Python or Java, and API automation. As an ETL Automation Tester, you will be responsible for ensuring the quality, performance, and scalability of data integration workflows across multiple systems.

Required Skills & Qualifications:
- Experience: Minimum of 5 years of experience in ETL testing or data testing roles, with a strong focus on automation.
- Automation Testing: Proven experience with test automation frameworks and tools, specifically Selenium (for web-based applications) and Python or Java (for scripting).
- Azure Databricks: Hands-on experience with Azure Databricks and Spark-based data transformations.
- API Automation: Strong knowledge and experience in API testing and automation tools (e.g., Postman, RestAssured, or similar); see the illustrative sketch below.
- SQL Knowledge: Strong SQL skills for data validation and working with relational and non-relational databases.
- ETL Tools: Familiarity with popular ETL tools (e.g., Talend, Informatica, SSIS) and data integration concepts.
- Version Control: Experience with version control systems such as Git or SVN.
- CI/CD Integration: Experience with CI/CD pipelines and tools like Jenkins, Azure DevOps, or GitLab CI.
- Communication: Excellent written and verbal communication skills to document and report issues clearly.

Note: Candidates must be able to join immediately or within 15 days.
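An illustrative sketch of the API-automation piece; the endpoint, payload fields, and run ID below are hypothetical, not a real service.

```python
import requests

# Hypothetical pipeline-status endpoint exposed by the data platform.
BASE_URL = "https://example-dataplatform.internal/api"


def assert_pipeline_succeeded(run_id: str) -> None:
    """Fetch an ETL run's status and assert it completed and wrote rows."""
    resp = requests.get(f"{BASE_URL}/runs/{run_id}", timeout=30)
    resp.raise_for_status()
    body = resp.json()

    # Field names are assumptions about the (hypothetical) API's response shape.
    assert body["status"] == "SUCCEEDED", f"run {run_id} ended in {body['status']}"
    assert body["rows_written"] > 0, "pipeline wrote no rows"


if __name__ == "__main__":
    assert_pipeline_succeeded("2024-06-01-daily-load")
```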

Posted 3 months ago

Apply

4 - 9 years

10 - 15 Lacs

Pune

Work from Office

Source: Naukri

- Good communication skills and client exposure.
- Understanding of the ETL process; knowledge of SQL/DB2 is good to have.
- Strong SQL skills, including writing complex queries, joins, and subqueries.
- Experience with data warehousing concepts, data modeling, and data quality principles.
- Preparing test plans and test estimations; writing test scenarios and test cases.
- Experience with Agile methodologies (Scrum, Kanban); ability to work independently and as part of a team.
- Validates data by running queries in the database and verifies that the results match expectations.
- Analyzes and troubleshoots erroneous results, determines root causes, logs defects, and enables defect management, including working with the development team on defect resolution.
- Solid understanding of databases and systems and the ability to communicate effectively across groups.
- Self-starter with a strong confidence level; flexible in thought and creative in approach to testing.
- Added advantage: insurance (or financial) domain knowledge and exposure to performance testing.

Posted 3 months ago

Apply

1 - 3 years

2 - 6 Lacs

Noida

Work from Office

Source: Naukri

Role & Responsibilities:
- Receive and process/produce new media, following strict handling guidelines.
- Transfer data to and from work servers.
- Validate and process data using a series of SQL scripts and standard eDiscovery applications/tools.
- Analyze and resolve data integrity issues.
- Make accurate and timely posts to our work-request tracking systems.
- Generate a variety of reports based on special requests.
- Work with other departments to resolve issues associated with specific data loads.
- Work closely with team members to ensure ongoing jobs are completed within the defined timelines.
- Work to improve efficiencies by identifying bottlenecks and working with the department manager to further automate processes.
- Work closely with management on high-priority and special projects.
- Adhere to CLCS operational guidelines and processes to ensure work is done consistently.

Preferred Candidate Profile:
- 1-3 years of experience in eDiscovery, including relevant working knowledge of at least one eDiscovery review platform (Viewpoint or Relativity).
- Excellent written and verbal communication skills to interact seamlessly with co-workers.
- Education: technical background (B.Sc in Computer Science, B.Tech).
- Good knowledge of eDiscovery platforms/tools such as LAW, ViewPoint, and Relativity.
- Strong team player with a willingness to learn quickly.

Interested candidates can DM or share their resume at sakshi.srivastava@conduent.com with the details below:
- Total experience
- Notice period
- Current location
- Current CTC

Posted 3 months ago

Apply

8 - 11 years

10 - 13 Lacs

Gurgaon

Work from Office

Source: Naukri

Senior Data Scientist (8-11 Years Experience)

Company Overview
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology, and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 700,000+ people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. At Accenture, we offer a dynamic and challenging environment for analytical and creative minds. We value a diverse workforce committed to achieving client goals and delivering results. We're built on a foundation of exceptional talent and are on the lookout for driven, energetic professionals to enhance our team.

Key Responsibilities

Data Handling
- Data Collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data Audit: Lead assessment of data quality, identify gaps, and create summaries as per database requirements.
- Data Scrubbing: Assist with creating data cleansing rules and incorporate data clarifications provided by data source owners.
- Data Profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend Classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report Generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic Refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques.
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary Research: Conduct market research to create company and industry primers using online secondary data or information sources.

Technical Requirements
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.

Collaboration and Communication
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Other Important Details
- Work Location: Gurugram, India
- Expected Start Dates: July 2024

Qualifications - Who Should Apply?
- Work Experience: 8-11 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic Qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.

Job Requirements:
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.

Posted 3 months ago

Apply

6 - 8 years

8 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

Data Scientist (6-8 Years Experience)

Company Overview
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology, and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 700,000+ people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. At Accenture, we offer a dynamic and challenging environment for analytical and creative minds. We value a diverse workforce committed to achieving client goals and delivering results. We're built on a foundation of exceptional talent and are on the lookout for driven, energetic professionals to enhance our team.

Key Responsibilities

Data Handling
- Data Collection: Participate in calls with stakeholders (internal and external, based in the US) to gather data from various sources (email, Dropbox, Egnyte, databases).
- Data Audit: Lead assessment of data quality, identify gaps, and create summaries as per database requirements.
- Data Scrubbing: Assist with creating data cleansing rules and incorporate data clarifications provided by data source owners.
- Data Profiling: Assist with creating multi-dimensional data views, data analysis reports, and extracts.

Data Classification
- Spend Classification: Analyze procurement spend using several techniques to comprehensively classify it into a custom taxonomy in Accenture's spend analytics tool.
- Enhancements: Diligently incorporate feedback and make recommendations for process improvement.
- Report Generation: Create specific and opportunity spend-assessment reports/templates.
- Periodic Refreshes: Lead discussions with US-based stakeholders for data gathering, data quality checks, control total validation, and spend classification.

Advanced Analytics and AI/ML
- Develop custom data models and algorithms to apply to data sets.
- Use machine learning tools and statistical techniques to produce solutions to problems.
- Implement clustering and auto-classification using predictive and supervised learning techniques.
- Design and implement complex data models from scratch.
- Develop and optimize ETL processes to ensure efficient data handling and processing.
- Create intuitive and effective front-end interfaces from scratch.
- Apply AI/ML techniques to optimize supply chain management, including demand forecasting, inventory optimization, and supplier performance analysis.
- Utilize advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Develop and implement AI/ML models for predictive analytics and automated decision-making in supply chain operations.

Industry Research
- Secondary Research: Conduct market research to create company and industry primers using online secondary data or information sources.

Technical Requirements
- Python: Hands-on experience with threading limitations and multi-process architecture.
- MySQL: Ability to integrate multiple data sources using MySQL.
- Strong coding knowledge and experience with several languages (e.g., R, SQL, JavaScript, Java, CSS, C++).
- Familiarity with statistical and data mining techniques (e.g., GLM/regression, random forest, boosting, trees, text mining, social network analysis).
- Experience with advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.

Collaboration and Communication
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Excellent spoken and written English communication skills, with the ability to participate in global team calls.

Other Important Details
- Work Location: Gurugram, India
- Expected Start Dates: July 2024

Qualifications - Who Should Apply?
- Work Experience: 6-8 years of relevant experience in data modeling, ETL automation, AI/ML, and front-end design.
- Academic Qualifications: Bachelor's or Master's degree in Engineering, Mathematics, Computer Science, or a related field.

Job Requirements:
- Extensive experience in handling and classifying spend data using AI/ML techniques.
- Strong leadership and team management skills.
- Proficiency in MS Excel and MS PowerPoint.
- High attention to detail, accuracy, and innovative problem-solving skills.
- Preferred: experience in supply chain management, with a focus on applying AI/ML techniques to optimize operations.

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Hyderabad

Work from Office

Source: Naukri

ETL Testing
Location: Hyderabad
Grade: B2/C1
Mode of interview: L1 - virtual, L2 - face-to-face

Key Responsibilities:
- Experience in ETL automation; able to understand complex DB queries and joins and automate them.
- Strong knowledge of Java, OOPS concepts, and Selenium; familiar with Selenium frameworks for ETL.
- Knowledge of CI/CD pipelines to streamline deployment processes.
- Understanding of requirements; good knowledge of Informatica PowerCenter / TOSCA.
- Good knowledge of identifying test scenarios and writing test cases.
- Good knowledge of SQL, Oracle, Unix, and WinSCP.
- Good knowledge of DWH concepts; validating source data accuracy, completeness, and correctness.
- Good communication skills; able to work in a team and as an individual.
- Good understanding of the defect life cycle.

Posted 3 months ago

Apply

3 - 8 years

15 - 25 Lacs

Chennai, Bengaluru

Hybrid

Source: Naukri

Hi, wishes from GSN! Pleasure connecting with you!

We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT / non-IT clients in India, and have been successfully meeting our clients' needs for the last 20 years. At present, GSN is hiring ETL AUTOMATION TESTING professionals for one of our leading MNC clients.

~~~~ LOOKING FOR IMMEDIATE JOINERS ~~~~

Work Location: Chennai & Pune
Job Role: Data Quality Engineer
Experience: 3 - 8 years
CTC Range: 15 LPA - 25 LPA
Work Type: Hybrid only

Technical Skills - Must Have:
- Python / Java coding (for data analysis).
- Strong in advanced / complex SQL queries.

Good to Have:
- Experience in Selenium, Cucumber / BDD (training can be provided).
- Working knowledge of CI/CD (training can be provided).
- Hands-on functional testing experience (training can be provided).

If interested, kindly APPLY for an IMMEDIATE response.

Thanks & Regards,
Shobana | GSN | shobana@gsnhr.net | Google Reviews: https://g.co/kgs/UAsF9W

Posted 3 months ago

Apply

1 - 5 years

7 - 15 Lacs

Bengaluru

Hybrid

Source: Naukri

Job Title: Data Automation Engineer
Location: Bangalore
Corporate Title: Analyst / Sr. Analyst (NCT)
Skills: Expertise in coding/programming in Python, automation processes, VBA, and SQL; ETL automation. Good to have: exposure to automation tools (Blue Prism, AA) and process automation.

Overview
KYC Operations play an integral part in the firm's first line of defense against financial crime, reducing the risk of working with new clients (primarily Know Your Customer (KYC) risk), whilst ensuring client relationships are on-boarded and maintained efficiently. KYC Operations provide a golden source of quality reference data for CIB, underpinning the firm's key Regulatory, Control & Governance standards. Within KYC Operations there is a dedicated global group, KYC Transformation, that drives end-to-end delivery. Our team partners with stakeholders in and outside of KYC Ops to ensure our processes are fit for purpose, follow a uniform standard, and continuously improve, thereby adding quantifiable value to support colleagues and clients in a flexible, fast, and focused manner.

As a Data Automation Engineer, you will build fast solutions to help Operations and other parts of the bank deliver their highest value, removing repetitive tasks, building strategic data pipelines, and ensuring automation is robust and stable using solutions including Python, VBA, MS Power Platform (Power Automate, Power Apps, Power BI), SQL, and SharePoint. Our approach is to ensure the solution can be merged into strategic tooling and fits the technology design process standards.

Your key responsibilities
- Work with stakeholders to identify opportunities to drive business solutions and improvements.
- Automate manual effort, providing tactical solutions to improve speed and value.
- Work in an agile way to deliver proofs of concept and fast solutions using technologies appropriate to the problem statements and requirements.
- Enhance your personal and team network to ensure cooperation yields efficiencies, for example sharing solutions with a wider team, re-using existing solutions, and enhancing solutions to have a wider and more beneficial business impact.

Your skills & experience
- Analyse, design, develop, test, deploy, and support Digital Services software solutions.
- Exposure to ETL technologies and methods.
- Expertise in coding/programming in Python and VBA, and SQL skills to extract data sets efficiently (an illustrative sketch appears below).
- Experience developing business solutions in any of MS Power Apps, MS Power Automate, or RPA.
- Excellent spatial reasoning and the ability to view process and data in two or three dimensions.
- Process mapping, process re-engineering, and data orientation, with experience in enterprise process modelling for current and future state.
- The ability to generate innovative ideas and deliver effectively, highlighting blockers if needed.
- Exposure to workflow solutions, Alteryx, Pentaho, Celonis, Linux, and database tuning is desirable.
- Documenting solutions (i.e., creation and upkeep of artefacts: requirement docs, SDDs, test scripts, JIRA tickets, KSDs post go-live).
- Provide L1 support to the existing RPA solution and resolve issues with minimum TAT to ensure business resiliency.

Competencies
- Work alongside solutions architects, business analysts, and BOT controlling to contribute to solution designs.
- Highly organized with a keen eye for detail and a proven record of operating in a fast-paced environment.
- Ability to work independently and as part of the team, with an enterprising spirit and a passion to learn and apply new technologies.
- Excellent communication skills with the ability to converse clearly with stakeholders from all cultures.
- Ability to work well in a global and virtual team, under pressure, and to multi-task.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
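For illustration only (the table and query are hypothetical): a minimal sketch of the "SQL skills to extract data sets efficiently" point, pulling a filtered extract into pandas and saving it for downstream automation; an in-memory SQLite database stands in for a real source system.

```python
import sqlite3

import pandas as pd

# In-memory stand-in for an operational database so the sketch is runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE kyc_cases (case_id INTEGER, status TEXT, risk_rating TEXT, opened_on TEXT);
    INSERT INTO kyc_cases VALUES
        (1, 'OPEN', 'HIGH', '2024-05-01'),
        (2, 'CLOSED', 'LOW', '2024-04-12'),
        (3, 'OPEN', 'MEDIUM', '2024-05-20');
""")

# Parameterised extract: only open cases, ordered for the downstream report.
query = """
    SELECT case_id, risk_rating, opened_on
    FROM kyc_cases
    WHERE status = :status
    ORDER BY opened_on
"""
extract = pd.read_sql_query(query, conn, params={"status": "OPEN"})

# Persist the extract for whatever automation consumes it next.
extract.to_csv("open_kyc_cases.csv", index=False)
print(extract)
```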

Posted 3 months ago

Apply