
4 ETL Workflow Jobs

0.0 - 3.0 years

0 Lacs

Karnataka

On-site

We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory.

Requirements: strong Python programming and problem-solving skills; proficiency in data structures and their implementation; a solid grounding in relational databases and SQL; a degree in Computer Science; strong communication, prioritization, and organization skills; and a keen interest in learning and upskilling.

Responsibilities include Python programming, problem-solving, data structure implementation, database management, and meeting project requirements.

At GlobalLogic, we prioritize a culture of caring where people come first. You will experience an inclusive environment where you can build meaningful connections with your teammates, managers, and leaders. We are committed to your continuous learning and development, offering opportunities to sharpen your skills and advance your career. You will have the chance to work on projects that matter and make an impact, using your problem-solving skills to help clients reimagine what's possible. We believe in the importance of balance and flexibility, offering different career paths and work arrangements to help you achieve a balance between work and life. As a high-trust organization, integrity is key, and you can trust us to provide a safe, reliable, and ethical work environment. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. We collaborate with clients to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 day ago

Apply
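The Python Developer listing above centers on an extract-transform-load workflow: pulling data via an API, reshaping it, and writing it to PostgreSQL. As a rough illustration of that shape only, here is a minimal sketch; the payload, field names, and the use of an in-memory sqlite3 database as a stand-in for PostgreSQL are all assumptions for demonstration, not details from the posting.

```python
import json
import sqlite3

# Hypothetical API payload; a real workflow would fetch this from an
# HTTP endpoint and would load into PostgreSQL rather than sqlite3.
raw_payload = json.dumps([
    {"id": 1, "name": "alice", "score": "91.5"},
    {"id": 2, "name": "bob", "score": "78.0"},
])

def extract(payload: str) -> list[dict]:
    """Parse the JSON response body into Python records."""
    return json.loads(payload)

def transform(records: list[dict]) -> list[tuple]:
    """Cast types and normalise values before loading."""
    return [(r["id"], r["name"].title(), float(r["score"])) for r in records]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Insert the transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS scores (id INTEGER, name TEXT, score REAL)"
    )
    conn.executemany("INSERT INTO scores VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw_payload)), conn)
print(conn.execute("SELECT name, score FROM scores ORDER BY id").fetchall())
# → [('Alice', 91.5), ('Bob', 78.0)]
```

Each stage is a plain function, so the same pipeline can be re-pointed at a real API client and a PostgreSQL connection without changing its structure.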

7.0 - 10.0 years

6 - 9 Lacs

Hyderabad, Telangana, India

On-site

Job description: We are looking for an experienced Ab Initio Lead Engineer to lead the design, development, and implementation of data integration and ETL solutions using the Ab Initio platform.

Development & Implementation: Design and develop robust, scalable ETL workflows and data integration solutions using Ab Initio tools such as GDE, Conduct>It, and Express>It. Create reusable frameworks for data processing and transformations. Optimize ETL processes to ensure high performance and scalability.

Technical Leadership: Lead a team of engineers, providing technical guidance and ensuring adherence to coding and design standards. Review code, troubleshoot issues, and ensure the quality of deliverables.

Solution Design: Collaborate with architects and business analysts to understand data requirements and translate them into technical solutions. Contribute to data modeling and architecture design for enterprise data systems.

Required skills: Ab Initio, Ab Initio Lead, ETL workflow, GDE, data integration

Posted 5 days ago

Apply

0.0 - 3.0 years

0 Lacs

Karnataka

On-site

We are looking for a Python Developer with working knowledge of ETL workflows and experience in data extraction using APIs and writing queries in PostgreSQL. The ideal candidate has strong Python programming and problem-solving skills, is proficient in data structures and their implementation, and has solid knowledge of relational databases and SQL. A degree in Computer Science is required for this role. Strong communication, prioritization, and organizational skills are essential, along with a willingness to learn and upskill.

As a Python Developer, your responsibilities will include Python programming, problem-solving, data structure implementation, database management, requirements analysis, and implementation. You will collaborate with cross-functional teams and demonstrate continuous learning and improvement in your work.

At GlobalLogic, we offer a culture of caring where people come first. We prioritize inclusivity, acceptance, and belonging, fostering meaningful connections among teammates, managers, and leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. You will have the chance to work on interesting and impactful projects that challenge your problem-solving skills and creativity. We believe in providing a balanced and flexible work environment that allows you to achieve a harmonious work-life balance. Integrity and trust are fundamental values at GlobalLogic, ensuring a safe, reliable, and ethical work environment for all employees. GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to the world's largest companies. Since 2000, we have been driving the digital revolution by creating innovative digital products and experiences. Join us in transforming businesses and industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

7.0 - 10.0 years

30 - 45 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities

* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Posted 1 month ago

Apply
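The data quality bullet in the listing above (validation checks before load) can be sketched as a simple row-level filter that partitions records into valid and rejected sets. The rules and field names here (non-null id, score within 0-100) are illustrative assumptions, not taken from the posting.

```python
# Minimal sketch of a pre-load data quality gate: each row is checked
# against a set of rules, and failing rows are quarantined with the
# reasons they failed rather than silently dropped.
def validate(rows):
    valid, rejected = [], []
    for row in rows:
        errors = []
        if row.get("id") is None:
            errors.append("missing id")
        score = row.get("score")
        if not isinstance(score, (int, float)) or not 0 <= score <= 100:
            errors.append("score out of range")
        if errors:
            rejected.append((row, errors))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "score": 88},       # passes both checks
    {"id": None, "score": 42},    # fails: missing id
    {"id": 3, "score": 250},      # fails: score out of range
]
valid, rejected = validate(rows)
print(len(valid), len(rejected))  # → 1 2
```

Keeping the rejected rows together with their error reasons is what makes downstream monitoring and anomaly reporting possible, rather than just counting failures.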