19 ETL/ELT Workflows Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

Role Overview: You are a skilled Looker Developer with 6 to 9 years of experience, having a strong background in Google Cloud Platform (GCP). Your primary responsibility will be to design, build, and optimize scalable data models, dashboards, and analytics solutions. Your expertise in LookML, data visualization, and GCP data services such as BigQuery, Cloud Storage, and Dataflow will be crucial for this role. Key Responsibilities: - Develop and maintain Looker dashboards, Looks, and Explores to provide valuable business insights. - Create and optimize LookML models, views, and derived tables. - Collaborate with business and data engineering teams to understand reporting needs and transform t...

Posted 2 days ago


5.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Sr. Data Engineer - Data Modeler for our client, a FTSE 250 global fintech company headquartered in London, you will be responsible for creating and implementing logical and physical data models to support structured and unstructured data across various platforms. Your role will involve collaborating with data architects, business analysts, and data consumers to ensure that data models meet business requirements and adhere to industry best practices. - Create and implement logical and physical data models across RDBMS and big data platforms. - Design and implement semantic data models to optimize data accessibility, performance, and usability for Business Intelligence (BI) and analytics...

Posted 2 weeks ago


6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Data Analyst QA Engineer for a remote position, your role will involve ensuring data accuracy across systems and establishing scalable QA processes for enterprise data platforms. You will leverage your expertise in SQL, data pipelines, ETL/ELT workflows, and Snowflake automation to contribute to the success of the organization. Key Responsibilities: - Validate data pipelines, ETL/ELT workflows, and warehouse transformations. - Write and optimize complex SQL queries for data testing. - Perform source-to-target data validation and accuracy checks. - Develop Snowflake automation for data quality assurance. - Define and execute QA processes to ensure data reliability. - Collaborate with cro...
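The source-to-target validation this role describes can be sketched in miniature. This is an illustrative example only, not the employer's actual process: the table names and checks are hypothetical, and SQLite stands in for a real warehouse such as Snowflake.

```python
import sqlite3

# Hypothetical source and target tables standing in for a warehouse load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def validate(conn, src, tgt):
    """Compare row counts and a column checksum between source and target."""
    src_count, src_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_count, tgt_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return {"rows_match": src_count == tgt_count,
            "sums_match": abs(src_sum - tgt_sum) < 1e-9}

result = validate(conn, "src_orders", "tgt_orders")
print(result)  # both checks pass for this toy data
```

Real pipelines add per-column null checks, duplicate detection, and reconciliation reports, but the count-and-checksum comparison above is the usual starting point.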

Posted 3 weeks ago


8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

Role Overview: As a Snowflake Advisor at Fiserv, you will be responsible for leading the data warehousing strategy, implementation, maintenance, and support. You will design, develop, and implement Snowflake-based solutions to ensure scalable, efficient, and secure data systems that support business analytics and decision-making processes. Collaboration with cross-functional teams and acting as the subject matter expert for Snowflake will be key aspects of your role. Key Responsibilities: - Define and implement best practices for data modeling, schema design, and query optimization in Snowflake - Develop and manage ETL/ELT workflows to ingest, transform, and load data into Snowflake from va...

Posted 3 weeks ago


2.0 - 8.0 years

0 Lacs

maharashtra

On-site

As a Palantir Foundry Developer, your role will involve designing, developing, and maintaining data pipelines, applications, and reports within the Palantir Foundry platform. You should have over 8 years of IT experience, with at least 2 years of specific experience in working with Palantir and Python. Your primary focus will be on leveraging the Workshop application to create user interfaces, dashboards, and workflows that facilitate data-driven decision-making. Collaboration with cross-functional teams is crucial to deliver high-quality data solutions while maintaining performance and data integrity. Responsibilities: - Develop and maintain data pipelines using Python, PySpark, and SQL wit...

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Engineer at our organization, you will play a crucial role in our data team by utilizing your expertise in Python and PySpark to design, develop, and maintain scalable data pipelines and infrastructure. Your responsibilities will involve powering our analytics and machine learning initiatives through the creation of robust and high-performance data workflows. - Design and implement data pipelines using PySpark and Python - Develop ETL/ELT workflows for data ingestion, transformation, and loading - Optimize data processing jobs for enhanced performance and cost-efficiency in distributed environments - Collaborate with data scientists, analysts, and business stakeholders to comprehen...
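The extract-transform-load pattern these responsibilities revolve around can be sketched as follows. Plain Python stands in for PySpark here (no Spark cluster required), and every name is hypothetical; the structure — extract, clean/transform, load into a sink — is what carries over.

```python
# Minimal ETL sketch: extract raw records, transform them, load into a "sink".
raw_events = [
    {"user": "a", "ts": "2024-01-01", "value": "3"},
    {"user": "b", "ts": "2024-01-02", "value": "bad"},  # malformed row
    {"user": "a", "ts": "2024-01-03", "value": "7"},
]

def extract(records):
    """Extraction step; in PySpark this would be a spark.read.* call."""
    return list(records)

def transform(records):
    """Cast values and drop malformed rows; analogous to DataFrame ops."""
    out = []
    for r in records:
        try:
            out.append({**r, "value": int(r["value"])})
        except ValueError:
            continue  # skip (or quarantine) records that fail casting
    return out

def load(records, sink):
    """Load step: append clean rows to the target store."""
    sink.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract(raw_events)), warehouse)
print(loaded)  # 2 clean rows loaded
```

In PySpark the same stages become distributed DataFrame operations, which is where the performance and cost-efficiency tuning mentioned above comes in.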

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced IICS Developer, you will play a vital role in supporting a critical data migration project from Oracle to Snowflake. This remote position requires you to work night-shift hours to synchronize with the U.S. team. Your main responsibility will involve developing and optimizing ETL/ELT workflows, working closely with architects/DBAs for schema conversion, and ensuring data quality, consistency, and validation throughout the migration process. Key Responsibilities: - Utilize your strong hands-on experience with IICS (Informatica Intelligent Cloud Services) to build mappings, tasks, and parameter files. - Collaborate with architects/DBAs for schema conversion and ensure data qua...

Posted 1 month ago


8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in the data warehousing strategy and implementation. Your responsibilities will include: - Designing, developing, and leading the adoption of Snowflake-based solutions for efficient and secure data systems - Defining and implementing best practices for data modeling, schema design, and query optimization in Snowflake - Developing and managing ETL/ELT workflows to ingest, transform, and load data from various resources into Snowflake - Monitoring and tuning Snowflake performance, managing caching, clustering, and partitioning for efficiency - Collaborating with cross-functional teams and ensuring seamless integration wi...

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Senior AI Engineer at Uplevyl, you will play a crucial role in leading the design and deployment of AI-powered, agentic workflows that drive the future of personalized insights. Your main focus will be on vector search, retrieval-augmented generation (RAG), and intelligent automation, collaborating closely with full-stack engineers and product teams to bring scalable GenAI features into production. Key Responsibilities: - Design and implement RAG pipelines for semantic search, personalization, and contextual enrichment. - Build agentic AI workflows using Pinecone, LangChain/LangGraph, and custom orchestration. - Integrate LLM-driven features into production systems, balancing innovation...
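The retrieval step of a RAG pipeline like the one described can be shown in miniature: documents are embedded as vectors, and the closest match to the query vector is returned by cosine similarity. The tiny hand-made vectors below are placeholders for real embeddings served by a vector store such as Pinecone; the document texts are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical document store: text mapped to a toy embedding vector.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "account setup": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, docs, k=1):
    """Return the k documents most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]),
                    reverse=True)
    return ranked[:k]

# A query vector that lies close to the refund-policy embedding.
print(retrieve([0.8, 0.2, 0.1], docs))  # ['refund policy']
```

In a production pipeline the retrieved passages are then injected into the LLM prompt (the "augmented generation" half), which frameworks like LangChain/LangGraph orchestrate.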

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Technical Data Engineer at our organization, you will be responsible for developing robust ETL/ELT workflows to ingest, transform, and load data into our data warehouse, Azure Synapse Analytics, using Informatica. Your role will involve contributing to the data ecosystem by performing exploratory data analysis (EDA) to validate pipeline outputs and ensure data accuracy. You will take ownership of data quality by implementing proactive checks and monitoring to maintain data integrity and reliability. Writing and optimizing complex SQL queries to support the data needs of our analytical and reporting teams will also be a key part of your responsibilities. Additionally, collaborating with d...

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled Data Modeler with expertise in Iceberg and Snowflake, responsible for designing and optimizing data models for scalable and efficient data architectures. Working closely with cross-functional teams, you ensure data integrity, consistency, and performance across platforms. Your key responsibilities include designing and implementing robust data models tailored to meet business and technical requirements. Leveraging Starburst, Iceberg, and Snowflake, you build scalable and high-performance data architectures. You optimize query performance and ensure efficient data storage strategies. Collaboration with data engineering and BI teams is essential to define data requirem...

Posted 1 month ago


6.0 - 10.0 years

0 Lacs

chandigarh

On-site

As a Data Architect with over 6 years of experience, you will be responsible for designing and implementing modern data lakehouse architectures on cloud platforms such as AWS, Azure, or GCP. Your primary focus will be on defining data modeling, schema evolution, partitioning, and governance strategies to ensure high-performance and secure data access. In this role, you will own the technical roadmap for scalable data platform solutions, ensuring they are aligned with enterprise needs and future growth. You will also provide architectural guidance and conduct code/design reviews across data engineering teams to maintain high standards of quality. Your responsibilities will include building an...

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

We are looking for a GCP Cloud Engineer for a position based in Pune. As a GCP Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions on Google Cloud Platform. Your expertise in GCP services, solution design, and programming skills will be crucial for developing scalable and efficient cloud solutions. Your key responsibilities will include designing and implementing GCP-based data solutions following best practices, developing workflows and pipelines using Cloud Composer and Apache Airflow, building and managing data processing clusters using Dataproc, working with GCP services like Cloud Functions, Cloud Run, and Cloud Storage, and integrating mul...
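Pipelines in Cloud Composer are expressed as Airflow DAGs of dependent tasks, and the core scheduling idea — a task runs only after all of its upstream tasks finish — amounts to a topological ordering. A dependency-free sketch of that idea, with hypothetical task names (Airflow itself is not imported here):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical task graph: each task maps to the set of tasks it depends on,
# mirroring Airflow's `extract >> transform >> load` style wiring.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

# static_order() yields tasks so every task appears after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # upstream tasks always precede their downstream tasks
```

An Airflow DAG adds scheduling, retries, and operators on top, but any valid run order it produces respects exactly this constraint.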

Posted 2 months ago


5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are an experienced IICS (Informatica Intelligent Cloud Services) Developer with a strong background in the IICS platform. You possess in-depth knowledge of Snowflake and excel in creating and managing integrations across various systems and databases. Your role involves collaborating on cloud-based integration solutions, ensuring seamless data flow between platforms, and optimizing performance for large-scale data processes. Your primary responsibilities include designing, developing, and implementing data integration solutions using IICS. You will work extensively with Snowflake data warehouse solutions, handling tasks such as data loading, transformation, and querying. Building, monito...

Posted 2 months ago


4.0 - 8.0 years

0 Lacs

karnataka

On-site

We empower our people to stay resilient and relevant in a constantly changing world. We are looking for individuals who are always seeking creative ways to grow and learn, individuals who aspire to make a real impact, both now and in the future. If this resonates with you, then you would be a valuable addition to our dynamic international team. We are currently seeking a Senior Software Engineer - Data Engineer (AI Solutions). In this role, you will have the opportunity to: - Design, build, and maintain data pipelines to cater to the requirements of various stakeholders, including software developers, data scientists, analysts, and business teams. - Ensure that the data pipelines are modular...

Posted 2 months ago


8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data from various resources into Snowflake, integrating data from diverse systems like databases, APIs...

Posted 3 months ago


5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Engineer at our organization, you will play a crucial role in our data team by utilizing your expertise in Python and PySpark to design, develop, and maintain scalable data pipelines and infrastructure. Your responsibilities will involve powering our analytics and machine learning initiatives through the creation of robust and high-performance data workflows. With a minimum of 5 years of experience in data engineering or a related field, you will be expected to design and implement data pipelines using PySpark and Python, develop ETL/ELT workflows for data ingestion, transformation, and loading, and optimize data processing jobs for enhanced performance and cost-efficiency in distr...

Posted 3 months ago


3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced IICS Developer, you will be responsible for supporting a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be on developing and optimizing ETL/ELT workflows, collaborating with architects/DBAs for schema conversion, and ensuring data quality, consistency, and validation throughout the migration process. To excel in this role, you must possess strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and a working knowledge of Snowflake, specifically data...

Posted 3 months ago


5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Engineer at our company, you will play a crucial role in our data team by designing, building, and maintaining scalable data pipelines and infrastructure using your expertise in Python and PySpark. With a focus on supporting our analytics and machine learning initiatives, you will collaborate with various stakeholders to ensure data quality, integrity, and governance across all pipelines. You should have at least 5 years of experience in data engineering or a related field to excel in this role. Your responsibilities will include developing ETL/ELT workflows, optimizing data processing jobs for performance and cost-efficiency, and monitoring production data workflows to proactively...

Posted 3 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies