
5 ETL Workflow Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

6 - 16 Lacs

Bengaluru

Work from Office

About the Team/Role
As a Data Engineer at WEX, you'll be responsible for building and maintaining the bridge between our data and the rest of the organization. You'll work with stakeholders to understand business requirements and then implement SQL-first transformation workflows to deploy analytics code. You'll help ensure the integrity, reliability, and usability of data for stakeholders so they can make critical data-driven decisions.

How you'll make an impact
You are a highly motivated individual who loves working as part of a high-performing team. You are constantly learning and upskilling. You are a critical thinker with strong analytical and problem-solving abilities. You are self-motivated and able to work independently with minimal supervision.

We are seeking a Data Engineer to play a critical role in the development of WEX's data & analytics capabilities. You will be part of an organization focused on the development and delivery of data solutions. You'll be part of a team that is responsible for:
* Creating optimized data pipelines.
* Working with stakeholders to understand business requirements and then implementing transformation workflows.
* Designing efficient data marts catered to the needs of specific business units, functions, or departments.
* Creating and maintaining system configurations.

Experience you'll bring
The successful candidate is motivated by data solutions, is technically proficient, and enjoys working in a fast-paced environment. You care deeply about the veracity (i.e. consistency, accuracy, quality, and trustworthiness) of data. You enjoy designing, maintaining, and optimizing data pipelines and infrastructure for data collection, management, transformation, and access. In addition, you:
* Are a strong critical thinker with analytical and problem-solving abilities.
* Bring thought leadership to your area of responsibility and enjoy staying ahead in your field.
You possess the following skills and experiences:
* Solid understanding of ETL tooling to perform data transformation tasks.
* Proven, hands-on experience developing complex ETL workflows using Informatica (PowerCenter and/or IICS).
* Understanding of data design principles and dimensional data modelling.
* SQL skills and understanding of query optimization strategies.
It would be a bonus if you have:
* Familiarity with BI and reporting platforms, particularly SAP BusinessObjects (BO).
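The SQL and dimensional-modelling skills listed above can be illustrated with a minimal star-schema query. This is a hypothetical sketch, not part of the role: sqlite3 stands in for a real warehouse, and all table and column names (dim_product, fact_sales, etc.) are invented for the example.

```python
import sqlite3

# Minimal star schema: one fact table joined to one dimension table.
# All names here are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# A typical data-mart aggregation: revenue per product category.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('gadgets', 7.5), ('widgets', 15.0)]
```

Joining a narrow fact table to descriptive dimension tables like this is the core pattern behind the dimensional data modelling the listing asks for.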

Posted 6 days ago

Apply

0.0 - 3.0 years

0 Lacs

Karnataka

On-site

We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory.

As a Python Developer, you will need good experience in Python programming and problem-solving. You should also be proficient in data structures and their implementation, as well as in relational databases and SQL. A degree in Computer Science is required for this position. Additionally, strong communication, prioritization, and organization skills are essential, along with a keen interest in learning and upskilling.

Your responsibilities will include Python programming, problem-solving, data structure implementation, database management, and meeting project requirements. You will be expected to demonstrate commitment to continuous learning and development.

At GlobalLogic, we prioritize a culture of caring where people come first. You will experience an inclusive environment where you can build meaningful connections with your teammates, managers, and leaders. We are committed to your continuous learning and development, offering various opportunities to sharpen your skills and advance your career. You will have the chance to work on projects that matter and make an impact, using your problem-solving skills to help clients reimagine what's possible. We believe in the importance of balance and flexibility, offering different career paths and work arrangements to help you achieve a balance between work and life. As a high-trust organization, integrity is key, and you can trust us to provide a safe, reliable, and ethical work environment.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. We collaborate with clients to transform businesses and redefine industries through intelligent products, platforms, and services.
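The extract-transform-load pattern this role centres on can be sketched in a few lines of Python. This is an illustrative assumption, not the employer's codebase: the "API response" is an inline JSON literal so the sketch stays runnable, and sqlite3 stands in for PostgreSQL (with psycopg2 the pattern is the same, using %s placeholders instead of ?).

```python
import json
import sqlite3

# Extract: in practice this JSON would come from an API call
# (urllib/requests); a literal payload keeps the sketch self-contained.
payload = json.loads('[{"id": 1, "name": " alice "}, {"id": 2, "name": "Bob"}]')

# Transform: trim whitespace and normalise name casing.
records = [(r["id"], r["name"].strip().title()) for r in payload]

# Load: sqlite3 stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", records)
names = [row[0] for row in conn.execute("SELECT name FROM users ORDER BY id")]
print(names)  # ['Alice', 'Bob']
```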

Posted 1 month ago

Apply

7.0 - 10.0 years

6 - 9 Lacs

Hyderabad, Telangana, India

On-site

Job description
We are looking for an experienced Ab Initio Lead Engineer to lead the design, development, and implementation of data integration and ETL solutions using the Ab Initio platform.

Development & Implementation:
* Design and develop robust, scalable ETL workflows and data integration solutions using Ab Initio tools such as GDE, Conduct>It, and Express>It.
* Create reusable frameworks for data processing and transformations.
* Optimize ETL processes to ensure high performance and scalability.

Technical Leadership:
* Lead a team of engineers, providing technical guidance and ensuring adherence to coding and design standards.
* Review code, troubleshoot issues, and ensure the quality of deliverables.

Solution Design:
* Collaborate with architects and business analysts to understand data requirements and translate them into technical solutions.
* Contribute to data modeling and architecture design for enterprise data systems.

Required Skills: Ab Initio, Ab Initio Lead, ETL Workflow, GDE, Data Integration
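Ab Initio itself is proprietary, but the "reusable frameworks for data processing and transformations" idea above can be sketched tool-agnostically: each transformation is a plain function over records, and a pipeline runner composes them, so steps can be recombined across workflows. Every name and data value here is a hypothetical illustration, not Ab Initio's API.

```python
from functools import reduce

# Reusable transform steps: plain functions from records to records.
def dedupe(records):
    seen, out = set(), []
    for r in records:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out

def uppercase_name(records):
    return [{**r, "name": r["name"].upper()} for r in records]

# The framework part: compose any list of steps into one pipeline.
def run_pipeline(records, steps):
    return reduce(lambda acc, step: step(acc), steps, records)

data = [{"id": 1, "name": "ada"}, {"id": 1, "name": "ada"}, {"id": 2, "name": "bob"}]
result = run_pipeline(data, [dedupe, uppercase_name])
print(result)  # [{'id': 1, 'name': 'ADA'}, {'id': 2, 'name': 'BOB'}]
```

The same composition idea is what graph-based ETL tools provide visually: reusable components wired into a dataflow.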

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

Karnataka

On-site

We are looking for a Python Developer with working knowledge of ETL workflow and experience in data extraction using APIs and writing queries in PostgreSQL. The ideal candidate must have good experience in Python programming and problem-solving, be proficient in data structures and implementation, and possess knowledge of relational databases and SQL. A degree in Computer Science is required for this role. Additionally, strong communication, prioritization, and organizational skills are essential, along with a willingness to learn and upskill.

As a Python Developer, your responsibilities will include Python programming, problem-solving, data structure implementation, database management, requirements analysis, and implementation. You will be expected to collaborate with cross-functional teams and demonstrate continuous learning and improvement in your work.

At GlobalLogic, we offer a culture of caring where people come first. We prioritize inclusivity, acceptance, and belonging, fostering meaningful connections among teammates, managers, and leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. You will have the chance to work on interesting and impactful projects that challenge your problem-solving skills and creativity. We believe in providing a balanced and flexible work environment that allows you to achieve a harmonious work-life balance. Integrity and trust are fundamental values at GlobalLogic, ensuring a safe, reliable, and ethical work environment for all employees.

GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to the world's largest companies. Since 2000, we have been driving the digital revolution by creating innovative digital products and experiences. Join us in transforming businesses and industries through intelligent products, platforms, and services.

Posted 1 month ago

Apply

7.0 - 10.0 years

30 - 45 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities
* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
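The data quality & governance responsibility above (validation checks before data reaches downstream consumers) can be sketched as a simple null-and-range check that quarantines bad rows. The field name "amount" and the allowed range are assumptions made up for the example, not part of the role's actual standards.

```python
# Minimal data-quality validation: split incoming rows into good rows
# and quarantined rows. Field names and thresholds are illustrative.
def validate(rows, min_amount=0.0, max_amount=10_000.0):
    good, bad = [], []
    for row in rows:
        amount = row.get("amount")
        if amount is None or not (min_amount <= amount <= max_amount):
            bad.append(row)   # quarantine for later inspection
        else:
            good.append(row)  # safe to load downstream
    return good, bad

rows = [{"id": 1, "amount": 42.0}, {"id": 2, "amount": None}, {"id": 3, "amount": -5.0}]
good, bad = validate(rows)
print(len(good), len(bad))  # 1 2
```

In production this kind of check typically lives in a framework such as dbt tests or a monitoring job, but the gate-then-quarantine shape stays the same.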

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
