
13,457 ETL Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Role Overview
We are seeking a highly skilled and motivated Data Engineer (Gurgaon) with 2+ years of experience. If you're passionate about coding, problem-solving, and innovation, we'd love to hear from you!

About Us
CodeVyasa is a mid-sized product engineering company that works with top-tier product/solutions companies such as McKinsey, Walmart, RazorPay, Swiggy, and others. We are about 550+ people strong, and we cater to product and data engineering use cases around Agentic AI, RPA, full-stack development, and various other GenAI areas.

Key Responsibilities:
- Design, build, and manage scalable data pipelines using Azure Data Factory and PySpark.
- Lead data warehousing and lakehouse architecture initiatives to support advanced analytics and BI use cases.
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Build and maintain insightful dashboards and reports using Power BI.
- Mentor junior team members and provide technical leadership across data projects.
- Ensure best practices in data governance, quality, and security.

Must-Have Skills:
- 2-7 years of experience in data engineering and analytics.
- Strong hands-on experience with Azure Data Factory, PySpark, and Power BI.
- Deep understanding of data warehousing concepts and data lakehouse architecture.
- Proficiency in data modeling, ETL/ELT processes, and performance tuning.
- Strong problem-solving and communication skills.

Why Join CodeVyasa?
- Work on innovative, high-impact projects with a team of top-tier professionals.
- Continuous learning opportunities and professional growth.
- Flexible work environment with a supportive company culture.
- Competitive salary and comprehensive benefits package.
- Free healthcare coverage.

Here's a glimpse of what life at CodeVyasa looks like: Life at CodeVyasa.
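
For illustration, a minimal sketch of the kind of Azure Data Factory plus PySpark pipeline step this role describes; the storage paths, column names, and app name are hypothetical assumptions, not CodeVyasa's actual stack:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Extract: read raw CSV files landed by an ADF copy activity (illustrative path)
raw = spark.read.option("header", True).csv("abfss://raw@lakeacct.dfs.core.windows.net/orders/")

# Transform: deduplicate, type the timestamp, keep only positive amounts
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)
)

# Load: write partitioned Parquet for downstream Power BI models
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@lakeacct.dfs.core.windows.net/orders/"
)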

Posted 17 hours ago

Apply

2.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Be a part of India's largest and most admired news network! Network18 is India's most diversified media company in the fast-growing media market. The company has a strong heritage and presence in the magazine, television, and internet domains. Our brands like CNBC, Forbes and Moneycontrol are market leaders in their respective segments. The company has over 7,000 employees across all major cities in India and has consistently managed to stay ahead of the industry's growth curve. Network18 brings together employees from varied backgrounds under one roof, united by the hunger to create immersive content and ideas. We take pride in our people, who we believe are the key to realizing the organization's potential. We continually strive to enable our employees to realize their own goals by providing opportunities to learn, share and grow.

Role Overview:
We are seeking a passionate and skilled Data Scientist with over a year of experience to join our dynamic team. You will be instrumental in developing and deploying machine learning models, building robust data pipelines, and translating complex data into actionable insights. This role offers the opportunity to work on cutting-edge projects involving NLP, Generative AI, data automation, and cloud technologies to drive business value.

Key Responsibilities:
- Design, develop, and deploy machine learning models, with a strong focus on NLP (including advanced techniques and Generative AI) and other AI applications.
- Build, maintain, and optimize ETL pipelines for automated data ingestion, transformation, and standardization from various sources.
- Work extensively with SQL for data extraction, manipulation, and analysis in environments like BigQuery.
- Develop solutions using Python and relevant data science/ML libraries (Pandas, NumPy, Hugging Face Transformers, etc.).
- Utilize Google Cloud Platform (GCP) services for data storage, processing, and model deployment.
- Create and maintain interactive dashboards and reporting tools (e.g., Power BI) to present insights to stakeholders.
- Apply basic Docker concepts for containerization and deployment of applications.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Stay abreast of the latest advancements in AI/ML and NLP best practices.

Required Qualifications & Skills:
- 2 to 5 years of hands-on experience as a Data Scientist or in a similar role.
- Solid understanding of machine learning fundamentals, algorithms, and best practices.
- Proficiency in Python and relevant data science libraries.
- Good SQL skills for complex querying and data manipulation.
- Demonstrable experience with Natural Language Processing (NLP) techniques, including advanced models (e.g., transformers) and familiarity with Generative AI concepts and applications.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Preferred Qualifications & Skills:
- Familiarity and hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, Cloud Functions, and Vertex AI.
- Basic understanding of Docker and containerization for deploying applications.
- Experience with dashboarding tools like Power BI and building web applications with Streamlit.
- Experience with web scraping tools and techniques (e.g., BeautifulSoup, Scrapy, Selenium).
- Knowledge of data warehousing concepts and schema design.
- Experience in designing and building ETL pipelines.

Disclaimer: Network18 and related group companies do not use the services of vendors or agents for recruitment. Please beware of agents or vendors offering assistance; Network18 will not be responsible for any losses incurred. "We correspond only from our official email address."
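
A minimal sketch of the NLP side of the Data Scientist role above, using the Hugging Face pipeline API the posting names; the default model and the sample headlines are illustrative assumptions:

from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a small default English model
headlines = [
    "Markets rally as quarterly earnings beat estimates",
    "Regulator opens probe into payment outages",
]
for text, result in zip(headlines, classifier(headlines)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {text}")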

Posted 17 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Infosys BPM Ltd.!

Exclusive Women's Walk-in Drive. We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of the resume.
https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview Date: 20th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find the job descriptions below for your reference. Work from Office; rotational shifts; a minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional testing and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the Infra domain; a technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
- Experience writing complex SQL queries and using SQL tools is a must; exposure to various data analytical functions.
- Familiarity with defect tracking tools (e.g., Jira).
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Experience with Python or other scripting languages for test automation is a plus.
- Experience with data quality tools is a plus.
- Experience in testing large datasets.
- Experience in agile development is a must.
- Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
- Strong experience in ETL testing and automation.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
- Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
- Hands-on experience in developing and maintaining test automation frameworks.
- Proficiency in at least one programming language (e.g., Python, Java).
- Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
- Strong understanding of data warehousing concepts and methodologies.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
- Experience with data quality tools is a plus.

Job Description: .NET
- Should have worked on .NET development/implementation/support projects.
- Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
- Must have experience in web services, Web API, REST services, HTML, and CSS3.
- Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Talents without registration & assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored, and talent gets evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots on the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
- Interview mode: walk-in.

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment and use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will be considered grounds for rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment.

Documents to carry: please have a note of your Candidate ID & AMCAT ID along with the registered email ID.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- Please carry 2 sets of your updated resume/CV (hard copy).
- Please carry original ID proof for security clearance; an original government ID card is a must.
- Please carry individual headphones/Bluetooth for the interview.

Regards,
Infosys BPM Recruitment team.

Posted 17 hours ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote


Role: Database Engineer
Location: Remote
Notice Period: 30 Days

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and able to work independently.
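
A minimal sketch of the Pandas plus SQLAlchemy import workflow this posting lists; the connection string, CSV layout, and staging table name are assumptions for illustration:

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")

# Extract and lightly transform a third-party CSV feed
df = pd.read_csv("vendor_feed.csv", parse_dates=["created_at"])
df["email"] = df["email"].str.strip().str.lower()
df = df.drop_duplicates(subset="customer_id")

# Load into a staging table; switch to if_exists="append" for incremental imports
df.to_sql("customers_stg", engine, if_exists="replace", index=False)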

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: AWS Data Engineer
Job Location: Chennai, Indore, Pune
Experience Requirement: 5+ years

Required Technical Skills: strong knowledge of AWS Glue, AWS Redshift, SQL, and ETL; good knowledge of and experience in PySpark for forming complex transformation logic. Primary: AWS Data Engineer, SQL, ETL, DWH. Secondary: AWS Glue, Airflow.

Must-Have
- Good knowledge of SQL and ETL.
- A minimum of 3+ years' experience with and understanding of Python core concepts, and implementing data pipeline frameworks using PySpark and AWS.
- Works well independently as well as within a team.
- Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift.
- Proactive and organized, with excellent analytical and problem-solving skills.
- Flexible and willing to learn; a can-do attitude is key.
- Strong verbal and written communication skills.

Good-to-Have
- Good knowledge of SQL and ETL, understanding of Python core concepts, and experience implementing data pipeline frameworks using PySpark and AWS.
- Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift.

Responsibilities: AWS Data Engineer (PySpark / Python / SQL / ETL)
- A minimum of 3+ years' experience with and understanding of Python core concepts, and implementing data pipeline frameworks using PySpark and AWS.
- Good knowledge of SQL and ETL, and of working with various AWS services including S3, Glue, DMS, and Redshift.

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Company Name: Nectarbits Pvt Ltd
Location: Gota, Ahmedabad
Experience: 5+ years

Job Description
- Responsible for the successful delivery and closure of multiple projects.
- Facilitates team activities, including daily stand-up meetings, grooming, sprint planning, demonstrations, release planning, and team retrospectives.
- Ensure the team is aware of tasks and delivery dates.
- Develop project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility.
- Identify scope creep to ensure projects are on track, and judge commercial viability and actionable steps.
- Lead sprint planning sessions and periodic conference calls with clients and senior team members to agree on the prioritization of projects and tasks.
- Be a central point of contact, responsible for the projects handled, and provide transparency and collaboration with different teams.
- Represent the team's needs and requirements to the client to ensure timelines and quality delivery are practically achievable.
- Build a trusting and safe environment where problems can be raised and resolved.
- Understand clients' business and processes to provide effective solutions as a technology consultant.
- Report and escalate to management as needed.
- Quick learner and implementer.

Must Have:
- Hands-on development experience in QA automation and managing large-scale projects.
- Experience in managing new development projects with at least an 8-10 person team for a duration of 6+ months (excluding ongoing support and maintenance projects/tasks), developing the project and release plan, and adhering to the standard processes of the organization.
- Excellent verbal and written communication skills with both technical and non-technical customers.
- Strong understanding of architecture, design, and implementation of technical solutions.
- Extremely fluent in REST/SOAP APIs with JSON/XML; experience in ETL is a plus.
- A good understanding of N-tier and microservice architecture.
- Well-versed in Agile development methodology and all its ceremonies.
- Excellent problem-solving/troubleshooting skills, particularly in anticipating and solving problems, issues, risks, or concerns before they become critical.
- Prepare a clear and effective communications plan, and ensure proactive communication of all relevant information to the customer and to all stakeholders.
- Experience in creating wireframes and/or presentations to effectively convey technology solutions.

Good to Have:
- Assess and work with the sales team to create and review proposals and contracts to determine a proper project plan.

Posted 17 hours ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Location: Bangalore & Gurugram
Years of experience: 7+

We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark.
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets.
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms like Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools like Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.

Interested candidates can submit their details at https://forms.office.com/r/g2h52X7Bt9
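
A minimal sketch of the Unity Catalog access-control work this role centers on; it runs only in a Databricks notebook where `spark` is provided and Unity Catalog is enabled, and the catalog, schema, table, and group names are hypothetical:

grants = [
    "GRANT USE CATALOG ON CATALOG main TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA main.sales TO `data-analysts`",
    "GRANT SELECT ON TABLE main.sales.orders TO `data-analysts`",
]
for stmt in grants:
    spark.sql(stmt)  # `spark` is the session provided by the Databricks runtime

# Review effective grants for auditing
spark.sql("SHOW GRANTS ON TABLE main.sales.orders").show(truncate=False)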

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Karnataka, India

On-site


Job Title: Senior Product Manager
Location: Onsite - Bangalore, India
Shift Timing: UK Shift (01:30 PM IST - 10:30 PM IST)

About the Role:
We are looking for a Senior Product Manager to take ownership of inbound product management for key enterprise SaaS offerings. This role involves driving product strategy, defining and managing requirements, coordinating with engineering and cross-functional teams, and ensuring timely, customer-focused releases. If you are passionate about solving real-world problems through innovative product development in the cloud and data integration space, we'd love to connect.

Roles & Responsibilities:
- Lead the execution of the product roadmap, including go-to-market planning, product enhancements, and launch initiatives.
- Translate market needs and customer feedback into detailed product requirements and specifications.
- Conduct competitive analysis, assess industry trends, and define effective product positioning and pricing.
- Collaborate with engineering teams to deliver high-quality solutions within defined timelines.
- Create and maintain Product Requirement Documents (PRDs), functional specs, use cases, and internal presentation materials.
- Evaluate build vs. buy options and engage in strategic partnerships to deliver comprehensive solutions.
- Work closely with marketing to build sales enablement tools: product datasheets, pitch decks, whitepapers, and more.
- Act as a domain expert by providing product training to internal teams such as sales, support, and services.
- Join client interactions (calls and demos) to gather insights, validate solutions, and support adoption.
- Ensure alignment between product vision, business goals, and technical feasibility throughout development cycles.

Skills & Qualifications:
- Minimum 5+ years of experience in product management for SaaS or enterprise software products.
- Proven track record in delivering inbound-focused product strategy and leading full product lifecycles.
- Experience with data integration, ETL, or cloud-based data platforms is highly desirable.
- Strong working knowledge of cloud platforms like AWS, GCP, Azure, or Snowflake.
- Familiarity with multi-tenant SaaS architectures and tools like Salesforce, NetSuite, etc.
- Demonstrated ability to work in Agile environments with distributed development teams.
- Exceptional analytical, communication, and stakeholder management skills.
- Ability to prioritize effectively in fast-paced, evolving environments.
- Bachelor's degree in Computer Science, Business Administration, or a related field; MBA preferred.
- Experience working with international teams or global product rollouts is a plus.

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Test Engineer
Location: Hyderabad (Onsite)
Experience Required: 5 Years

Job Description:
We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines. The ideal candidate should have a solid background in SAS programming, data validation, and test automation within enterprise data environments.

Key Responsibilities:
- Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance.
- Write and execute test cases/scripts using Base SAS, macros, and SQL.
- Perform SQL query validation and data reconciliation using industry-standard practices.
- Validate ETL pipelines developed using tools like Talend, IBM Data Replicator, and Qlik Replicate.
- Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms.
- Utilize test automation frameworks using Selenium, Python, or shell scripting to increase test coverage and reduce manual effort.
- Identify, document, and track bugs through resolution, ensuring high-quality deliverables.

Required Skills:
- Strong experience in SAS programming (Base SAS, macros).
- Expertise in writing and validating SQL queries.
- Working knowledge of data testing frameworks and reconciliation tools.
- Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, and Qlik Replicate.
- Proficiency in test automation using Selenium, Python, or shell scripts.
- Solid understanding of data pipelines and data integration testing practices.
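
A minimal sketch of the row-count and checksum reconciliation this role describes, comparing a source system against Snowflake with SQLAlchemy; the DSNs, drivers (oracledb, snowflake-sqlalchemy), and table names are illustrative assumptions:

from sqlalchemy import create_engine, text

src = create_engine("oracle+oracledb://user:pass@src-host/ORCL")
tgt = create_engine("snowflake://user:pass@account/db/schema")

checks = {
    "row_count": "SELECT COUNT(*) FROM claims",
    "amount_sum": "SELECT SUM(claim_amount) FROM claims",
}
for name, sql in checks.items():
    # Run the identical probe query on both ends and compare results
    with src.connect() as s, tgt.connect() as t:
        src_val = s.execute(text(sql)).scalar()
        tgt_val = t.execute(text(sql)).scalar()
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")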

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Greetings from Tata Consultancy Services! TCS is hiring for Python Developer.
Experience: 5-10 years
Location: Pune/Hyderabad/Bangalore/Chennai/Kochi/Bhubaneswar

Please find the JD below.
Required Technical Skill: ETL development experience.
Must Have: 4+ years of experience in Python.

Posted 18 hours ago

Apply

8.0 years

0 Lacs

India

Remote


Job Title: Data Engineer
Experience: 5-8 Years
Location: Remote
Shift: IST (Indian Standard Time)
Contract Type: Short-Term Contract

Job Overview
We are seeking an experienced Data Engineer with deep expertise in Microsoft Fabric to join our team on a short-term contract basis. You will play a pivotal role in designing and building scalable data solutions and enabling business insights in a modern cloud-first environment. The ideal candidate will have a passion for data architecture, strong hands-on technical skills, and the ability to translate business needs into robust technical solutions.

Key Responsibilities
- Design and implement end-to-end data pipelines using Microsoft Fabric components (Data Factory, Dataflows Gen2).
- Build and maintain data models, semantic layers, and data marts for reporting and analytics.
- Develop and optimize SQL-based ETL processes integrating structured and unstructured data sources.
- Collaborate with BI teams to create effective Power BI datasets, dashboards, and reports.
- Ensure robust data integration across various platforms (on-premises and cloud).
- Implement mechanisms for data quality, validation, and error handling.
- Translate business requirements into scalable and maintainable technical solutions.
- Optimize data pipelines for performance and cost-efficiency.
- Provide technical mentorship to junior data engineers as needed.

Required Skills
- Hands-on experience with Microsoft Fabric: Dataflows Gen2, Pipelines, OneLake.
- Strong proficiency in Power BI, including semantic modeling and dashboard/report creation.
- Deep understanding of data modeling techniques: star schema, snowflake schema, normalization, denormalization.
- Expertise in SQL, stored procedures, and query performance tuning.
- Experience integrating data from diverse sources: APIs, flat files, databases, and streaming.
- Knowledge of data governance, lineage, and data catalog tools within the Microsoft ecosystem.
- Strong problem-solving skills and ability to manage large-scale data workflows.

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Summary:
We are looking for a skilled ETL Tester with hands-on experience in validating data pipelines and data transformations in an AWS-based ecosystem. The ideal candidate should have a strong background in ETL testing, a solid understanding of data warehousing concepts, and proficiency with AWS tools and services like S3, Redshift, Glue, Athena, and Lambda.

Key Responsibilities:
- Design and execute ETL test cases for data ingestion, transformation, and loading processes.
- Perform data validation and reconciliation across source systems, staging, and target layers (e.g., S3, Redshift, RDS).
- Understand data mappings and business rules; write SQL queries to validate transformation logic.
- Conduct end-to-end testing, including functional, regression, and performance testing of ETL jobs.
- Work closely with developers, data engineers, and business analysts to identify and troubleshoot defects.
- Validate data pipelines orchestrated through AWS Glue, Step Functions, and Lambda functions.
- Utilize Athena and Redshift Spectrum for testing data stored in S3.
- Collaborate using tools like JIRA, Confluence, Git, and CI/CD pipelines.
- Prepare detailed test documentation, including test plans, test cases, and test summary reports.

Required Skills:
- 3-4 years of experience in ETL/data warehouse testing.
- Strong SQL skills for data validation across large datasets.
- Working knowledge of AWS services such as S3, Redshift, Glue, Athena, Lambda, and CloudWatch.
- Experience testing batch and streaming data pipelines.
- Familiarity with Python or PySpark is a plus for data transformation or test automation.
- Experience using ETL tools (e.g., Informatica, Talend, or AWS Glue ETL scripts).
- Knowledge of Agile/Scrum methodology.
- Understanding of data quality frameworks and test automation practices.

Good to Have:
- Exposure to BI tools like QuickSight, Tableau, or Power BI.
- Basic understanding of data lake and data lakehouse architectures.
- Experience working with JSON, Parquet, and other semi-structured data formats.
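
A minimal sketch of an automated Athena check on S3 data, in the spirit of the validation work above and using only standard boto3 calls; the region, database, table, and results bucket are hypothetical:

import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_query(sql):
    # Start the query and poll until Athena reports a terminal state
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "staging_db"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    assert state == "SUCCEEDED", f"query {qid} ended in state {state}"
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

def test_no_null_business_keys():
    rows = run_query("SELECT COUNT(*) FROM orders_stg WHERE order_id IS NULL")
    null_count = int(rows[1]["Data"][0]["VarCharValue"])  # rows[0] is the header row
    assert null_count == 0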

Posted 18 hours ago

Apply

3.5 years

0 Lacs

Pune, Maharashtra, India

On-site


Hiring across all levels.
Location: Pune (preferred) / Gurgaon

Mandatory Skillset
- Validated systems testing expertise.
- Validated systems testing and end-to-end test management expertise, plus strong client engagement.

Responsibilities:
- Contribute to the project's overall Computer System Validation deliverables.
- Create and execute test scenarios; select the best testing methodologies, techniques, and evaluation criteria for testing.
- Draft, review, and approve validation deliverables such as user requirements, technical design specifications, IQ/OQ/PQ scripts and reports, error logs, configuration documents, and the traceability matrix, and document 21 CFR Part 11 and EU Annex 11 compliance.
- Build automation scripts and help the leads in designing/configuring the test automation frameworks.
- Understand the various testing activities: unit, system, component, user acceptance, performance, integration, and regression.
- Participate in user story mapping, sprint planning, estimation, and feature walk-throughs.
- Experience in leading junior test analysts, assigning and tracking tasks provided to team members.
- Manage the end-to-end testing/validation lifecycle on applications like Solution Manager, JIRA, and HP ALM (desired).
- Self-motivated, team-oriented individual with strong problem-solving abilities.
- Evaluate risks, assess closure requirements, and process change controls for computerized systems.
- Define key test processes, best practices, KPIs, and collateral.
- Well-versed in ETL or automation testing.

Qualifications:
- Bachelor's/master's degree in engineering, science, medicine, or a related field.
- Hands-on experience in Computer System Validation of applications and infrastructure qualification.
- A minimum of 3.5-11 years of experience in computer systems validation and hands-on experience within GxP (GCP/GMP regulated environments: FDA, EU, MHRA).
- Experience in managing and leading testing-related activities.
- Experience in creating test scenarios and testing protocols (IQ/OQ/PQ/UAT) within the various SDLC or Agile phases as per the standard GxP protocol.
- In-depth understanding of defect management processes.
- Strong SQL skills for data validation and development of expected results.
- Hands-on with test management tools: JIRA and HP ALM.
- Experience in defining risk-based strategies for validation of computerized systems, and in authoring and reviewing end-to-end CSV documentation in accordance with various test plans.
- Understanding of CSV and project-related SOPs and WIs.
- Well-versed in Good Documentation Practices.

Posted 18 hours ago

Apply

0.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site


Experience: 6+ years
Work Mode: Hybrid

Job Summary:
We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and business intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Troubleshoot and optimize ETL jobs for performance and reliability.
- Analyze complex data sets and write advanced SQL queries for data validation and transformation.
- Collaborate with data architects and business analysts to implement data warehousing solutions.
- Apply SDLC methodologies throughout the ETL development lifecycle.
- Support production environments by identifying and resolving data and performance issues.
- Work with Unix shell scripting for job automation and scheduling.
- Contribute to the design of technical architectures that support digital transformation.

Required Skills:
- 3-5 years of hands-on experience with Informatica PowerCenter.
- Proficiency in SQL and familiarity with NoSQL platforms.
- Experience in ETL performance tuning and troubleshooting.
- Solid understanding of Unix/Linux environments and scripting.
- Excellent verbal and written communication skills.

Preferred Qualifications:
- AWS Certification or experience with cloud-based data integration is a plus.
- Exposure to data modeling and data governance practices.

Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: reliably commute or plan to relocate before starting work (required).

Application Question(s):
- What is your current CTC?
- What is your expected CTC?
- What is your current location?
- What is your notice period / LWD?
- Are you comfortable attending an L2 face-to-face interview in Hyderabad?

Experience:
- Informatica PowerCenter: 5 years (required)
- Total work: 6 years (required)

Work Location: In person

Posted 18 hours ago

Apply

15.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Title: Technical Architect - Data Governance & MDM
Experience: 15+ years
Location: Mumbai/Pune/Bangalore

Role Overview:
The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Architect and optimize strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills:
- 10+ years of experience working on DG/MDM projects.
- Strong on data governance concepts.
- Hands-on experience with different DG tools/services.
- Hands-on experience with reference data and taxonomy.
- Strong understanding of data governance, data quality, data profiling, data standards, regulations, and security.
- Match-and-merge strategy.
- Design and implement the MDM architecture and data models.
- Usage of Spark capabilities; statistics to deduce meaning from vast enterprise-level data; different data visualization means of analyzing huge data sets.
- Good to have: knowledge of Python/R/Scala.
- Experience with DG on-premises and on-cloud.
- Understanding of MDM and the Customer, Product, and Vendor domains and related artifacts.
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Must have good communication and presentation skills.
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere.

Eligibility Criteria
- 15+ years of total experience.
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.

Interested candidates can apply directly. Alternatively, you can also send your resume to ansari.m@atos.net

Posted 18 hours ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote


Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: PLSQL, SQL Writing, mSQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Primary Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have)
- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

What We Offer
- Fully remote internship with flexible working hours.
- Hands-on experience with real-world database projects.
- Mentorship from experienced database professionals.
- Certificate of completion and potential for a full-time opportunity based on performance.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal besides this one; depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 18 hours ago

Apply

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are a Rakuten Group company, providing global B2B/B2C services for the mobile telco industry and enabling next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are now taking our mobile offering global! To support our ambitions to provide an innovative cloud-native telco platform for our customers, we are looking to recruit and develop top talent in Digital Product Management. Let's build the future of mobile telecommunications together!

Role: Technical Program Manager
Team: RBSS Delivery organization

You will independently lead cross-organisation programs, influencing the roadmap priorities and technical direction across teams. You will work with stakeholders across the organisation and own the communication of all aspects of the program, including surfacing risks and progress towards the goal. You will guide the team towards technical solutions and make trade-off decisions, and you will drive program management best practices across the organisation. The role requires working closely with multiple functional teams (including but not limited to business, architects, engineering, and operations support) in building and maintaining program delivery timelines, unblocking teams, and defining and streamlining cross-functional dependencies, along with increasing the efficiency and velocity of project execution. You will likely spend most days in Agile, Kanban, or other project planning tools and in meetings with relevant stakeholders to keep projects moving forward, delivering a program execution strategy and timeline as well as regular reporting of project health to stakeholders throughout a project's life cycle.

Skills and Qualifications
- Up to 15 years of hands-on technical project/program management experience, with at least 10+ years program-managing or working in Scrums.
- Must have a telecom background, with at least 5+ years of exposure to working with telecom operators/ISPs (B2B, B2C customer solutions) in software delivery/integration in the BSS domain.
- Technology stack: managed complex data migration projects involving technologies such as cloud (AWS, GCP, or compatible), microservices, various DB solutions (Oracle, MySQL, Couchbase, Elastic DB, Camunda, etc.), data streaming technologies (such as Kafka), and the tools associated with this technology stack.
- Excellent knowledge of project management methodology and software development life cycles, including Agile, with excellent client-facing and internal communication skills.
- Ability to plan, organize, prioritize, and deliver multiple projects simultaneously.
- In-depth knowledge and understanding of telecom BSS business needs, with the ability to establish and maintain a high level of customer trust and confidence; solid organizational skills, including attention to detail and multitasking.
- Good understanding of the challenges associated with the BSS business and a high-level understanding of its modules (CRM, order management, revenue management, and billing services).
- Excellent verbal, written, and presentation skills to effectively communicate complex technical and business issues (and solutions) to diverse audiences.
- Strong analytical, planning, and organizational skills, with an ability to manage competing demands.
- Always curious, with a passion to learn continuously in a fast-moving environment.
- Strong working knowledge of Microsoft Office, Confluence, JIRA, etc.
- Good to have: Project Management Professional (PMP) / Certified Scrum Master certification.
- Good to have: knowledge of external solutions integrated with ETL software, billing, and warehouse/supply-chain-related migration projects.

Key Job Responsibilities
- Manage and streamline program planning by evaluating incoming project demand across multiple channels against available capacity.
- Regularly define and review KPIs; proactively seek out new and improved mechanisms for visibility, ensuring your program stays aligned with organization objectives.
- Develop and maintain Kanban boards and workstream dashboards.
- Work with stakeholders during the entire life cycle of the program: execute project requirements, prepare detailed project plans, identify risks, manage vendors and vendor resources, measure program metrics, and take corrective and preventive actions.
- Adopt Agile best practices (such as estimation techniques) and define and optimize processes.
- Coordinate with the product management team to plan features and stories into sprints, understand business priorities, and align required stakeholders so the team can deliver the expected outcome.
- Manage technology improvements and other enhancements from conceptualization to delivery: develop a deep understanding of their impact and pros/cons, work through the required detail, and collaborate with all stakeholders until successful deployment in production.
- Manage and deliver planned RBSS releases by working with customers; work with Scrum Masters, plan Scrum capacity, and manage team productivity.
- Monitor the progress of the software developed by Scrum teams and the quality of the deliverables.
- Work with engineering and product teams to scope product delivery, define solution strategies, and understand development alternatives, as well as provide support; ensure availability to the team to answer questions and deliver direction.
- Work across multiple teams and vendors (cross-cutting across programs, business/engineering teams, and/or technologies) to drive delivery strategy and dependency management, ensuring active delivery and proactive communication.
- Forecast and manage infrastructure and resourcing demand against the operational growth of the platform, in collaboration with engineering teams.
- Deliver Agile projects that offer outstanding business value to users.
- Support the stakeholders in implementing an effective project governance system.

"Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation and beliefs."

Posted 18 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: QA Tester Data
Job Type: Full-time
Location: On-site - Hyderabad, Pune or New Delhi

Job Summary:
Join our customer's team as a dedicated ETL Tester, where your expertise will drive the quality and reliability of crucial business data solutions. As an integral part of our testing group, you will focus on ETL testing while engaging in automation, API, and MDM testing to support robust, end-to-end data validation and integration. We value professionals who demonstrate strong written and verbal communication and a passion for delivering high-quality solutions.

Key Responsibilities:
- Design, develop, and execute comprehensive ETL test cases, scenarios, and scripts to validate data extraction, transformation, and loading processes.
- Collaborate with data engineers, business analysts, and QA peers to clarify requirements and ensure accurate data mapping, lineage, and transformations.
- Perform functional, automation, API, and MDM testing to support a holistic approach to quality assurance.
- Utilize tools such as Selenium to drive automation efforts for repeatable and scalable ETL testing processes.
- Identify, document, and track defects while proactively communicating risks and issues to stakeholders with clarity and detail.
- Work on continuous improvement initiatives to enhance test coverage, efficiency, and effectiveness within the ETL testing framework.
- Create and maintain detailed documentation for test processes and outcomes, supporting both internal knowledge sharing and compliance requirements.

Required Skills and Qualifications:
- Strong hands-on experience in ETL testing, including an understanding of ETL tools and processes.
- Proficiency in automation testing using Selenium or similar frameworks.
- Experience in API testing, functional testing, and MDM testing.
- Excellent written and verbal communication skills, with an ability to articulate technical concepts clearly to diverse audiences.
- Solid analytical and problem-solving abilities to troubleshoot data and process issues.
- Attention to detail and a commitment to high-quality deliverables.
- Ability to thrive in a collaborative, fast-paced, on-site team environment.

Preferred Qualifications:
- Prior experience working in large-scale data environments or within MDM projects.
- Familiarity with data warehousing concepts, SQL, and data migration best practices.
- ISTQB or a related QA/testing certification.
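
A minimal sketch of the Selenium-driven automation this posting pairs with ETL validation, checking a hypothetical internal load-status dashboard; it assumes a local ChromeDriver, and the URL and element IDs are illustrative:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.internal/reports/daily-load")
    status = driver.find_element(By.ID, "load-status").text
    row_count = int(driver.find_element(By.ID, "row-count").text)
    assert status == "SUCCESS", f"ETL dashboard shows status {status}"
    assert row_count > 0, "dashboard reports an empty load"
finally:
    driver.quit()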

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

Remote


Job Description
🔹 Position: Senior Data Analyst
📍 Location: Trivandrum/Kochi/Remote
🕓 Experience: 5+ Years
⌛ Notice Period: Immediate Joiners Only
🛠 Mandatory Skills: SQL, Power BI, Python, Amazon Athena

🔎 Job Purpose
We are seeking an experienced and analytical Senior Data Analyst to join our Data & Analytics team. The ideal candidate will have a strong background in data analysis, visualization, and stakeholder communication. You will be responsible for turning data into actionable insights that help shape strategic and operational decisions across the organization.

📍 Job Description / Duties & Responsibilities
- Collaborate with business stakeholders to understand data needs and translate them into analytical requirements.
- Analyze large datasets to uncover trends, patterns, and actionable insights.
- Design and build dashboards and reports using Power BI.
- Perform ad-hoc analysis and develop data-driven narratives to support decision-making.
- Ensure data accuracy, consistency, and integrity through data validation and quality checks.
- Build and maintain SQL queries, views, and data models for reporting purposes.
- Communicate findings clearly through presentations, visualizations, and written summaries.
- Partner with data engineers and architects to improve data pipelines and architecture.
- Contribute to the definition of KPIs, metrics, and data governance standards.

📍 Job Specification / Skills and Competencies
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
- 5+ years of experience in a data analyst or business intelligence role.
- Advanced proficiency in SQL and experience working with relational databases (e.g., SQL Server, Redshift, Snowflake).
- Hands-on experience in Power BI.
- Proficiency in Python, Excel, and data storytelling.
- Understanding of data modelling, ETL concepts, and basic data architecture.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management skills.
- Adherence to Information Security Management policies and procedures.

📍 Soft Skills Required
- Must be a good team player with good communication skills.
- Must have good presentation skills.
- Must be a proactive problem solver and a self-driven leader.
- Manage and nurture a team of data engineers.
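
A minimal sketch of the ad-hoc Pandas analysis loop this role describes, turning an exported query result into a weekly KPI trend; the CSV extract and column names are illustrative assumptions:

import pandas as pd

orders = pd.read_csv("orders_extract.csv", parse_dates=["order_date"])

# Weekly revenue trend with week-over-week growth, ready to shape a Power BI dataset
weekly = (
    orders.set_index("order_date")
          .resample("W")["revenue"].sum()
          .to_frame("revenue")
)
weekly["wow_growth_pct"] = weekly["revenue"].pct_change() * 100
print(weekly.tail(8).round(2))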

Posted 19 hours ago

Apply

3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment Type: Full-time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory.
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable.
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For
- 3+ years of experience in data engineering or analytics engineering.
- Hands-on experience with cloud data platforms and large-scale data processing.
- A strong problem-solving mindset and a passion for clean, efficient data design.

Job Description:
- A minimum of 3 years of experience with modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
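
A minimal sketch of a bronze-to-silver Delta step in the lakehouse pattern this stack implies; it assumes a Delta-enabled Spark environment (Databricks or delta-spark), and the paths and columns are hypothetical:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_bronze_to_silver").getOrCreate()

# Read the raw (bronze) Delta table, clean it, and publish the silver layer
bronze = spark.read.format("delta").load("/lake/bronze/events")
silver = (
    bronze.filter(F.col("event_type").isNotNull())
          .withColumn("event_date", F.to_date("event_ts"))
          .dropDuplicates(["event_id"])
)
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("event_date")
       .save("/lake/silver/events"))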

Posted 19 hours ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?

🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.
🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.
🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.

Learn more on the Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Lead Software Engineer – Data at Coupa:

The Lead Data Engineer plays a critical role in shaping Coupa's data infrastructure, driving the design and implementation of scalable, high-performance data solutions. Collaborating with teams across engineering, data science, and product, this role ensures the integrity, security, and efficiency of our data systems. Beyond technical execution, the Lead Data Engineer provides mentorship and defines best practices, supporting a culture of excellence. Their expertise will directly support Coupa's ability to deliver innovative, data-driven solutions, enabling business growth and reinforcing our leadership in cloud-based spend management.

What You'll Do:

Lead and drive the development and optimization of scalable data architectures and pipelines.
Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing.
Optimize data analysis and computation for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks.
Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services.
Ensure data security and governance, enforcing compliance with industry standards and regulations.
Develop and promote best practices for data modeling, processing, and analytics.
Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence.
Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making.
Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency.
Develop real-time and batch data processing solutions, integrating structured and unstructured data sources.

What You Will Bring to Coupa:

10+ years of experience in Data Engineering and application development, with at least 3 years in a technical lead role.
A graduate degree in Computer Science or a related field of study.
Experience with programming languages such as Python and Java; expertise in Python is a must.
Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
Expertise in processing and analyzing large data workloads.
Experience designing and implementing scalable data warehouse solutions to support analytical and reporting needs.
Experience with API development and design using REST or GraphQL.
Experience building and optimizing 'big data' pipelines, architectures, and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills for working with unstructured datasets.
Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Experience with big data tools: Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases.
Experience with data pipeline and workflow management tools.
Experience with AWS cloud services.

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees.

Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
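To make the streaming side of this role concrete, here is a compact Spark Structured Streaming sketch that aggregates hypothetical order events from Kafka into per-minute revenue; the broker address, topic, and event schema are illustrative assumptions, not Coupa systems.

```python
# Streaming sketch: Kafka order events -> per-minute revenue aggregates.
# Broker, topic, and event schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("spend_stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

revenue = (events
           .withWatermark("event_time", "10 minutes")  # tolerate late events
           .groupBy(F.window("event_time", "1 minute"))
           .agg(F.sum("amount").alias("revenue")))

# Console sink is a stand-in; a real pipeline would target a table or topic.
query = revenue.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```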

Posted 19 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Infosys BPM Ltd.!

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 18th & 19th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of your resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview Date: 18th & 19th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Work from office. Rotational shifts. A minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
Develop high-quality technical documents, including user manuals, guides, and release notes.
Collaborate with cross-functional teams to gather requirements and create accurate documentation.
Conduct functional and manual testing to ensure compliance with FDA regulations.
Ensure adherence to ISO standards and maintain a clean, organized document management system.
Strong understanding of the infrastructure domain.
A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems, and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, including exposure to various data analytical functions.
Familiarity with defect tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.
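For a sense of what ETL test automation like this looks like in practice, here is a small PyTest sketch of two common source-to-target checks; the connection string, schemas, and table names are placeholders, not Infosys systems.

```python
# PyTest sketch: row-count reconciliation and a not-null rule on a business key.
# The DSN and table names below are placeholders.
import sqlalchemy

ENGINE = sqlalchemy.create_engine("postgresql://user:pass@host/dw")  # placeholder

def scalar(sql: str) -> int:
    """Run a single-value query and return the result."""
    with ENGINE.connect() as conn:
        return conn.execute(sqlalchemy.text(sql)).scalar_one()

def test_row_counts_match():
    # Every staged source row should land in the target fact table.
    src = scalar("SELECT COUNT(*) FROM staging.orders")
    tgt = scalar("SELECT COUNT(*) FROM dw.fact_orders")
    assert src == tgt, f"count mismatch: staging={src}, target={tgt}"

def test_no_null_business_keys():
    nulls = scalar("SELECT COUNT(*) FROM dw.fact_orders WHERE order_id IS NULL")
    assert nulls == 0, f"{nulls} rows loaded without an order_id"
```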
Job Description: .NET
Should have worked on .NET development, implementation, or support projects.
Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
Must have experience in web services, Web API, REST services, HTML, and CSS3.
Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration & assessment will not be allowed into the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top-right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment and use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry:
A note of your Candidate ID & AMCAT ID, along with the registered email ID.
Two sets of your updated resume/CV (hard copy).
Original ID proof for security clearance; an original government ID card is a must.
An individual headphone or Bluetooth headset for the interview.

Pointers to note:
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.

Regards,
Infosys BPM Recruitment Team

Posted 19 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Infosys BPM Ltd.!

Exclusive Women's Walk-in Drive

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of your resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview Date: 20th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Work from office. Rotational shifts. A minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
Develop high-quality technical documents, including user manuals, guides, and release notes.
Collaborate with cross-functional teams to gather requirements and create accurate documentation.
Conduct functional and manual testing to ensure compliance with FDA regulations.
Ensure adherence to ISO standards and maintain a clean, organized document management system.
Strong understanding of the infrastructure domain.
A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems, and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, including exposure to various data analytical functions.
Familiarity with defect tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.

Job Description: .NET
Should have worked on .NET development, implementation, or support projects.
Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
Must have experience in web services, Web API, REST services, HTML, and CSS3.
Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration & assessment will not be allowed into the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top-right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment and use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry:
A note of your Candidate ID & AMCAT ID, along with the registered email ID.
Two sets of your updated resume/CV (hard copy).
Original ID proof for security clearance; an original government ID card is a must.
An individual headphone or Bluetooth headset for the interview.

Pointers to note:
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.

Regards,
Infosys BPM Recruitment Team

Posted 19 hours ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Role Title: Data Scientist
Location: India
Worker Type: Full-Time Employee (FTE)
Years of Experience: 8+ years
Start Date: Within 2 weeks
Engagement Type: Full-time
Salary Range: Flexible
Remote/Onsite: Hybrid (India-based candidates)

Job Overview:
We are looking for an experienced Data Scientist to join our team and contribute to developing AI-driven data conversion tools. You will work closely with engineers and business stakeholders to build intelligent systems for data mapping, validation, and transformation.

Required Skills and Experience:
• Bachelor's or Master's in Data Science, Computer Science, AI, or a related field
• Strong programming skills in Python and SQL
• Experience with ML frameworks like TensorFlow or PyTorch
• Solid understanding of AI-based data mapping, code generation, and validation
• Familiarity with databases like SQL Server and MongoDB
• Excellent collaboration, problem-solving, and communication skills
• At least 8 years of relevant experience in Data Science
• An open mindset with a willingness to experiment and learn from failures

Preferred Qualifications:
• Experience in the financial services domain
• Certifications in Data Science or AI/ML
• Background in data wrangling, ETL, or master data management
• Exposure to DevOps tools like Jira, Confluence, and Bitbucket
• Knowledge of cloud and AI/ML tools like Azure Synapse, Azure ML, Cognitive Services, and Databricks
• Prior experience delivering AI solutions for data conversion or transformation
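As a toy illustration of the data-mapping problem this role describes, the sketch below matches source columns to a target schema using standard-library string similarity; a production system would likely use learned embeddings instead, and every name here is hypothetical.

```python
# Baseline column-mapping sketch: pair source columns with target schema fields
# by string similarity. difflib is a deliberately simple stand-in for the
# AI-based mapping the posting describes; all names are illustrative.
from difflib import SequenceMatcher

TARGET_SCHEMA = ["customer_id", "order_date", "total_amount"]  # illustrative

def best_match(source_col: str, candidates: list[str]) -> tuple[str, float]:
    """Return the candidate with the highest similarity ratio to source_col."""
    scored = [(c, SequenceMatcher(None, source_col.lower(), c).ratio())
              for c in candidates]
    return max(scored, key=lambda pair: pair[1])

for col in ["CustID", "OrderDt", "TotalAmt"]:
    target, score = best_match(col, TARGET_SCHEMA)
    print(f"{col} -> {target} (confidence {score:.2f})")
```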

Posted 19 hours ago

Apply

5.0 - 6.0 years

0 Lacs

Thane, Maharashtra, India

On-site


Overview:
We are seeking an experienced and organized Team Lead to join our logistics operations on a third-party payroll. The ideal candidate will be responsible for overseeing the daily operations of the logistics team, ensuring efficient order fulfillment, supply chain coordination, and team management.

Key Responsibilities:
1. Team Management & Coordination:
- Support customer service and delivery activities by coordinating and directing teams handling shipping, receiving, and storage of goods.
- Focus on order fulfillment and supply chain coordination while managing team performance.
- Ensure high levels of team productivity and adherence to company policies, including safety standards.
2. Operational Efficiency:
- Develop strategies to maximize assets in logistics and inventory planning.
- Resolve issues impacting operational progress and ensure smooth workflow.
- Oversee daily operations, ensuring the team meets set KPIs and brand standards.
3. Scheduling & Payroll Management:
- Handle resource allocation, scheduling, and coordination of staff to meet operational demands.
- Ensure payroll accuracy for team members in alignment with working hours and attendance.
4. Technology & Process Management:
- Develop custom RF transactions and conversion programs to optimize logistics operations.
- Use RF scanners to pull products from stockroom and receiving areas to ensure smooth inventory flow.
5. Team Development:
- Promote and mentor team leaders for career growth into higher roles.
- Develop interns into leadership positions such as ETL (Extended Team Leader).
6. Stock Management & Compliance:
- Perform daily in-stocks using the PDA system to maintain accurate product counts.
- Ensure that all logistics operations comply with company policies, corporate standards, and safety regulations.
7. Leadership & Safety:
- Serve as the district assessor for Hardlines to ensure all stores meet corporate standards.
- Act as a role model in promoting safety and productivity on the floor.

Skills & Qualifications:
* Proven experience in team handling within the logistics industry (5-6 years).
* Strong organizational, time management, and problem-solving skills.
* Proficiency with RF scanners and PDA systems.
* Ability to manage multiple tasks and coordinate effectively under pressure.
* Excellent interpersonal and leadership skills with a focus on team development.

Posted 19 hours ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  • Junior ETL Developer
  • ETL Developer
  • Senior ETL Developer
  • ETL Tech Lead
  • ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium) (one common approach is sketched after this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
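
To ground the incremental-load question above, here is a minimal sketch of a timestamp-based delta load, which is one common answer; the connection string, schemas, and column names are illustrative assumptions.

```python
# Incremental ("delta") load sketch: copy only source rows newer than the
# latest timestamp already present in the target. All names are illustrative.
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://user:pass@host/dw")  # placeholder

INCREMENTAL_LOAD = sqlalchemy.text("""
    INSERT INTO dw.fact_orders (order_id, amount, updated_at)
    SELECT s.order_id, s.amount, s.updated_at
    FROM staging.orders AS s
    WHERE s.updated_at > (SELECT COALESCE(MAX(updated_at),
                                          TIMESTAMP '1970-01-01')
                          FROM dw.fact_orders)
""")

# begin() opens a transaction that commits on success and rolls back on error,
# so a failed run never leaves a partial batch behind.
with engine.begin() as conn:
    conn.execute(INCREMENTAL_LOAD)
```

In interviews, it also helps to name the trade-off: timestamp-based deltas miss hard deletes in the source, which is where change data capture (another question above) comes in.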

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
