77 Data Fusion Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal itself.

5.0 - 10.0 years

14 - 20 Lacs

Hyderabad

Work from Office

Interested candidates, please share an updated CV with dikshith.nalapatla@motivitylabs.com. Job Title: GCP Data Engineer. Overview: We are looking for a skilled GCP Data Engineer with 5 years of hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment. Key Responsibilities: Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Develop and maintain data ingestion fra...

Posted 2 days ago

AI Match Score
Apply

2.0 - 6.0 years

0 Lacs

Chennai, All India

On-site

As a Full Stack Data Engineer at our company, you will collaborate with Data Scientists and Product Development teams to create cutting-edge data products that align with our Company Objectives. Your responsibilities will include landing data, building new data products, enhancing existing ones, and collaborating with Analytics & Business partners to ensure solutions are production-ready. Key Responsibilities: - Utilize GCP services and tooling such as BigQuery, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Airflow, Cloud Storage, and Pub/Sub for data processing and management. - Demonstrate proficiency in Git or any other version control tool. - Possess 2+ years of coding experience in Python an...
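For readers unfamiliar with this stack, here is a minimal sketch of one pattern the listing implies: landing a file in Cloud Storage and announcing it on Pub/Sub with the official Python clients. All project, bucket, and topic names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: land a file in Cloud Storage, then announce it on Pub/Sub.
# Requires: pip install google-cloud-storage google-cloud-pubsub
# All resource names below are hypothetical placeholders.
import json

from google.cloud import pubsub_v1, storage

PROJECT_ID = "example-project"          # hypothetical
BUCKET_NAME = "example-landing-bucket"  # hypothetical
TOPIC_ID = "landed-files"               # hypothetical


def land_and_notify(local_path: str, blob_name: str) -> None:
    # Upload the raw file to the landing bucket.
    storage_client = storage.Client(project=PROJECT_ID)
    bucket = storage_client.bucket(BUCKET_NAME)
    bucket.blob(blob_name).upload_from_filename(local_path)

    # Publish a small JSON message so downstream pipelines can pick the file up.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
    message = json.dumps({"bucket": BUCKET_NAME, "name": blob_name}).encode("utf-8")
    publisher.publish(topic_path, message).result()  # block until published


if __name__ == "__main__":
    land_and_notify("daily_extract.csv", "raw/daily_extract.csv")
```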

Posted 3 days ago

AI Match Score
Apply

5.0 - 10.0 years

14 - 24 Lacs

Hyderabad

Work from Office

Experience: 5+ years. Job Summary: We are looking for an experienced Data Engineer/Analyst with a strong background in SQL and Python, and hands-on experience with at least one BI tool. You will be responsible for designing and building scalable data pipelines, transforming complex datasets, and supporting analytics and reporting needs across the organization. Key Responsibilities: Design, develop, and maintain robust ETL/ELT pipelines using SQL and Python. Optimize and troubleshoot complex SQL queries to ensure high performance and scalability. Build and maintain data models and data marts to support business analytics and decision-making. Collaborate with data analysts, data scientists, an...
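As a small, self-contained illustration of the SQL-plus-Python ETL pattern this role describes, the sketch below uses SQLite from the standard library as a stand-in for a real warehouse; the table names and transformation are hypothetical.

```python
# Minimal SQL + Python ETL sketch (hypothetical tables; SQLite stands in for a warehouse).
import sqlite3


def run_etl(conn: sqlite3.Connection) -> None:
    # Extract: pull raw order rows with a SQL query.
    rows = conn.execute(
        "SELECT customer_id, amount FROM raw_orders WHERE amount IS NOT NULL"
    ).fetchall()

    # Transform: aggregate revenue per customer in Python.
    revenue: dict[int, float] = {}
    for customer_id, amount in rows:
        revenue[customer_id] = revenue.get(customer_id, 0.0) + float(amount)

    # Load: write the result into a reporting data-mart table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS mart_customer_revenue "
        "(customer_id INTEGER PRIMARY KEY, revenue REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO mart_customer_revenue (customer_id, revenue) VALUES (?, ?)",
        revenue.items(),
    )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 10.0), (1, 5.5), (2, 7.0)])
    run_etl(conn)
    print(conn.execute("SELECT * FROM mart_customer_revenue ORDER BY customer_id").fetchall())
```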

Posted 1 week ago

AI Match Score
Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineering Engineer III at TekWissen in Chennai, you will play a crucial role in designing, building, and maintaining data solutions to efficiently collect, store, process, and analyze large volumes of data. Your responsibilities will include: - Collaborating with business and technology stakeholders to understand current and future data requirements - Designing, building, and maintaining reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis - Planning, designing, building, and maintaining scalable data solutions such as data pipelines, data models, and applications to ensure efficient and reliable data workflow - Implemen...

Posted 1 week ago

AI Match Score
Apply

10.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Description: Job Title: Data Architect (GCP). Location: Chennai/Hyderabad. Experience: 10-15 years overall | 5+ years in GCP architecture. Budget: Open to discussion. Notice Period: Immediate joiner or serving a notice period of less than 60 days. Responsibilities: Understand the customer's overall data platform, business and IT priorities, and success measures to design data solutions that drive business value. Apply technical knowledge to architect solutions and create data platforms and roadmaps on GCP that meet business and IT needs. Estimate and outline the solutions needed to implement cloud-native architecture and migration from on-prem systems. Executing the hands...

Posted 1 week ago

AI Match Score
Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you. We are seeking a Senior Google Cloud Data Engineer to join our Global Data & Analytics team at Fossil Group in Bangalore. If you are passionate about building robust data pipelines and enjoy solving complex data challenges in the cloud, this could be the perfect role for you. In this dynamic and globa...

Posted 1 week ago

AI Match Score
Apply

1.0 - 5.0 years

0 Lacs

Tamil Nadu

On-site

Role Overview: As a Data Engineering Engineer II at TekWissen in Chennai, you will be responsible for designing, building, and maintaining data solutions to collect, store, process, and analyze large volumes of data efficiently and accurately. You will collaborate with business and technology stakeholders to understand data requirements and design scalable data infrastructure for data collection, storage, and analysis. Your role will also involve designing and maintaining data platforms like data warehouses and data lakes for structured and unstructured data. Key Responsibilities: - Collaborate with business and technology stakeholders to understand current and future data requirements - Des...

Posted 3 weeks ago

AI Match Score
Apply

6.0 - 11.0 years

4 - 7 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities: Work on GCP-based Big Data deployments (batch and real-time) using components such as BigQuery, Cloud Composer (Airflow), Google Cloud Storage, Data Fusion, Dataflow, and Dataproc. Develop efficient data pipelines using Python and PySpark. Manage and operate solutions in Linux-based environments. Design and implement CI/CD pipelines for Big Data release deployments. Build log monitoring and alerting mechanisms for data pipelines and cloud workloads.
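To illustrate the Cloud Composer (Airflow) piece of this stack, here is a minimal DAG sketch that stages a CSV from Cloud Storage into BigQuery and then builds a reporting table with a SQL job. It assumes Airflow 2.4+ with the apache-airflow-providers-google package; every bucket, dataset, and table name is a hypothetical placeholder.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: load a GCS file into BigQuery, then transform it.
# Assumes Airflow 2.4+ and apache-airflow-providers-google; all resource names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_load",        # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage the raw CSV from Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",                                     # hypothetical
        source_objects=["raw/sales_{{ ds }}.csv"],
        destination_project_dataset_table="example-project.staging.sales",  # hypothetical
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Transform the staged rows into a reporting table with a SQL job.
    build_mart = BigQueryInsertJobOperator(
        task_id="build_mart",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-project.mart.daily_sales` AS "
                    "SELECT region, SUM(amount) AS revenue "
                    "FROM `example-project.staging.sales` GROUP BY region"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```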

Posted 3 weeks ago

AI Match Score
Apply

8.0 - 13.0 years

0 - 0 Lacs

Bangalore, Noida, Hyderabad

On-site

Maintain architecture principles, guidelines, and standards. Relevant technology domains: data warehousing, Big Data, data analytics, and GCP services; programming languages: Python/Java. Experience in designing and implementing solutions in these areas, with strong command of Google Cloud Platform data components (BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.). We need a technical Data Engineer who is strong in data warehousing, Big Data, and data analytics, with experience developing software in one or more languages such as Java and Python. Demonstrate extensive skills and success in the impleme...

Posted 3 weeks ago

AI Match Score
Apply

6.0 - 11.0 years

10 - 20 Lacs

Chennai, Coimbatore, Bengaluru

Work from Office

Design, develop, test, and maintain GCP-based data applications using Python, Terraform, BigQuery, and Cloud Run. Build scalable data solutions, APIs, and analytics pipelines in an agile environment. Required candidate profile: 6+ years in IT and 4+ years in development, with strong coding skills in Python and cloud (GCP); hands-on with Terraform, BigQuery, Kafka, and data engineering frameworks; strong analytical and problem-solving skills.
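As a rough sketch of the "Python + BigQuery + Cloud Run" combination this listing names, below is a tiny Flask service that answers an analytics query against BigQuery; the dataset and table names are hypothetical placeholders and the snippet is illustrative only.

```python
# Minimal sketch of a Cloud Run-style analytics API backed by BigQuery.
# Requires: pip install flask google-cloud-bigquery ; table names are hypothetical.
from flask import Flask, jsonify
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()  # uses the service's default credentials and project


@app.get("/revenue/<region>")
def revenue(region: str):
    # Parameterised query keeps the endpoint safe from SQL injection.
    job = bq.query(
        "SELECT SUM(amount) AS revenue "
        "FROM `example-project.mart.daily_sales` "  # hypothetical table
        "WHERE region = @region",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("region", "STRING", region)]
        ),
    )
    row = list(job.result())[0]  # SUM() always yields exactly one row
    return jsonify({"region": region, "revenue": row.revenue or 0.0})


if __name__ == "__main__":
    # Cloud Run expects the container to listen on $PORT; 8080 is the common default.
    app.run(host="0.0.0.0", port=8080)
```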

Posted 1 month ago

AI Match Score
Apply

5.0 - 6.0 years

22 - 27 Lacs

Chennai

Work from Office

We're Hiring: Artificial Intelligence Specialist (Immediate Joiners). Location: Chennai, Sholinganallur, ELCOT SEZ 600119. Job Type: Full-time | Permanent. Experience: 5+ years in enterprise software and AI/LLM integration. About the Role: Are you passionate about AI, LLMs, and Generative AI? We are looking for a solution-oriented AI Specialist to transform business operations with innovative AI solutions. This is not a traditional ML engineering role; we are seeking someone who can strategically integrate AI capabilities into enterprise systems and deliver real business impact. Immediate joiners welcome! Key Responsibilities: AI Solution Strategy & Innovation: Identify opportunities where AI/LLM in...

Posted 1 month ago

AI Match Score
Apply

8.0 - 11.0 years

9 - 19 Lacs

Hyderabad

Work from Office

We are looking for a DataStage Developer Lead to join our data engineering team. The ideal candidate will have strong experience in IBM InfoSphere DataStage, ETL development, and leading end-to-end data integration projects. The role requires hands-on development, team leadership, and close collaboration with business and technical stakeholders. Key Responsibilities: Lead the design, development, and implementation of ETL processes using IBM DataStage. Collaborate with business analysts and data architects to understand data requirements. Optimize, maintain, and troubleshoot existing DataStage jobs and data pipelines. Ensure high-quality deliverables through code reviews, unit testing, and d...

Posted 1 month ago

AI Match Score
Apply

4.0 - 9.0 years

4 - 5 Lacs

Sholinganallur

Work from Office

Role & Responsibilities. Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately. Key Responsibilities: 1) Collaborate with business and technology stakeholders to understand current and future data requirements. 2) Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis. 3) Plan, design, build, and maintain scalable data solutions including data pipelines, data models, and applications for efficient and...

Posted 1 month ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

Role Overview: As a GCP Cloud Data Engineer in Kochi, your main responsibility will be to design and develop data pipelines using GCP tools such as Cloud Dataflow, Data Fusion, Pub/Sub, Dataproc, and Composer (Airflow). You will also need to focus on data modeling, warehousing, integration, performance optimization, automation, CI/CD, data governance, and security within the GCP environment. Key Responsibilities: - Build, optimize, and maintain scalable ETL/ELT pipelines using GCP tools like Cloud Dataflow, Data Fusion, Pub/Sub, Dataproc, and Composer (Airflow). - Design and implement efficient data models and schemas for BigQuery or other GCP data stores. - Ingest structured and unstructure...
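For context on the Dataflow portion of this role, here is a minimal Apache Beam sketch of a batch ETL pipeline: read CSV lines from Cloud Storage, parse them, and write rows to BigQuery. The runner, bucket, and table names are hypothetical; switching the runner to DataflowRunner (with project, region, and temp-location options) would run the same code on Dataflow.

```python
# Minimal Apache Beam sketch of the kind of ETL pipeline Dataflow runs.
# Requires: pip install "apache-beam[gcp]" ; all resource names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    # Expected CSV layout (hypothetical): customer_id,amount
    customer_id, amount = line.split(",")
    return {"customer_id": int(customer_id), "amount": float(amount)}


def run() -> None:
    # Swap DirectRunner for DataflowRunner (plus project/region/temp_location)
    # to execute the same pipeline on Dataflow.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-landing-bucket/raw/orders.csv")
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                schema="customer_id:INTEGER,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```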

Posted 1 month ago

AI Match Score
Apply

12.0 - 15.0 years

30 - 35 Lacs

Noida, Hyderabad, Pune

Work from Office

Experience in designing and implementing solutions in the following areas: strong Google Cloud Platform data components, including BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.

Posted 1 month ago

AI Match Score
Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer at QualMinds, you will play a crucial role in maintaining and supporting existing MSSQL processes, pipelines, and schemas. Additionally, you will be responsible for migrating existing pipelines and processes to the cloud, specifically Snowflake and GCP. Your role will involve analyzing and organizing raw data sets to meet functional and non-functional requirements, as well as developing and testing new pipelines and processes in both MSSQL and cloud environments. Working closely with the data engineering team, you will design and implement scalable solutions to meet business needs and support analytics and reporting initiatives. Furthermore, you will collaborate with the s...

Posted 1 month ago

AI Match Score
Apply

8.0 - 13.0 years

4 - 9 Lacs

Hyderabad

Remote

Role & Responsibilities: We are seeking a GCP Solutions Expert who can bridge the gap between business vision and technical execution. This role is not for a pure developer; it is for someone who understands end-to-end cloud architectures, can visualize solutions, and has experience guiding teams in migrating enterprise data platforms (especially Informatica to GCP). You'll work closely with leadership to design, communicate, and document scalable GCP solutions across data engineering, automation, and analytics domains. Key Responsibilities: Understand high-level business problems and design solution architectures using GCP-native services (BigQuery, Dataflow, Dataproc, Composer, Cloud Functions, D...

Posted 1 month ago

AI Match Score
Apply

4.0 - 8.0 years

15 - 20 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating in India and many other countries around the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet. Job Title: Data Engineering Engineer II. Location: Chennai. Work Type: Hybrid. Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing, and analyzing large volumes of data efficiently and ...

Posted 1 month ago

AI Match Score
Apply

6.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Hybrid

Job Description: The Google Cloud DevOps Engineer will be responsible for automating infrastructure provisioning and configuration management using Terraform and Ansible. The role involves designing, implementing, and maintaining CI/CD pipelines on GCP using Azure DevOps. The ideal candidate will have extensive experience with GCP resources, particularly in data engineering, and possess strong scripting skills in Python and Bash. Responsibilities: Automate infrastructure provisioning and configuration management using Terraform and Ansible. Design, implement, and maintain CI/CD pipelines on GCP using Azure DevOps. Manage and optimize GCP resources, including Compute Engine, Data Fusion, Data...

Posted 1 month ago

AI Match Score
Apply

6.0 - 10.0 years

14 - 24 Lacs

Pune

Hybrid

Job Description: The Google Cloud DevOps Engineer will be responsible for automating infrastructure provisioning and configuration management using Terraform and Ansible. The role involves designing, implementing, and maintaining CI/CD pipelines on GCP using Azure DevOps. The ideal candidate will have extensive experience with GCP resources, particularly in data engineering, and possess strong scripting skills in Python and Bash. Responsibilities: Automate infrastructure provisioning and configuration management using Terraform and Ansible. Design, implement, and maintain CI/CD pipelines on GCP using Azure DevOps. Manage and optimize GCP resources, including Compute Engine, Data Fusion, Data...

Posted 1 month ago

AI Match Score
Apply

10.0 - 18.0 years

15 - 25 Lacs

Chennai

Work from Office

POSITION TITLE: Lead MSBI Developer / Data Engineer. This is a senior, hands-on role for a technical specialist focused on BI platform modernization. The primary mission is to lead the end-to-end analysis of our legacy Microsoft SSAS cubes and SSIS ETL workflows, create definitive technical documentation, and then use that knowledge to support the migration to a modern cloud data platform (GCP). Skills Required: Demonstrated ability to document complex systems, ability to communicate and work with cross-functional teams and all levels of management, Microsoft SQL Server, MSSQL, ETL. Skills Preferred: Cloud Composer, Airflow, PySpark, BigQuery, Google Cloud Platform - BigQuery, Dataflow, ...

Posted 1 month ago

AI Match Score
Apply

5.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a highly skilled GCP Cloud Solution Architect with expertise in Oracle systems, Oracle Data Integrator (ODI), BigQuery, and Google Vertex AI Search, your role will involve owning the end-to-end solution architecture, driving data integration flows, and providing technical leadership and hands-on delivery. Your key responsibilities include: Solution Ownership & Architecture: Define and own the end-to-end architecture for integrating Oracle product data with Google Vertex AI Search. Design, document, and govern data pipelines for ingestion, transformation, and indexing of product data. Establish best practices for scalability, availability, and performance of the search solutio...

Posted 1 month ago

AI Match Score
Apply

5.0 - 7.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote

Job Description: We are looking for an experienced GCP Data Engineer with at least 5 years of professional experience in data engineering, cloud-based data solutions, and large-scale distributed systems. This role is fully remote and requires a hands-on professional who can design, build, and optimize data pipelines and solutions on Google Cloud Platform (GCP). Key Responsibilities: Architect, design, and implement highly scalable data pipelines and ETL workflows leveraging GCP services. Develop and optimize data ingestion, transformation, and storage frameworks to support analytical and operational workloads. Work extensively with BigQuery, Dataflow, Pub/Sub, Dataproc, Data Fusion, Cloud...
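To make the Dataproc piece concrete, here is a minimal PySpark sketch of a batch job that reads raw CSV from Cloud Storage, aggregates it, and writes to BigQuery via the spark-bigquery connector commonly available on Dataproc clusters. All bucket and table names are hypothetical placeholders.

```python
# Minimal PySpark sketch of a Dataproc-style batch job: read raw CSV from Cloud
# Storage, aggregate it, and write the result to BigQuery via the spark-bigquery
# connector. All bucket/table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F


def main() -> None:
    spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

    # Read raw order records landed in Cloud Storage.
    orders = (
        spark.read.option("header", "true")
        .csv("gs://example-landing-bucket/raw/orders/*.csv")
        .withColumn("amount", F.col("amount").cast("double"))
    )

    # Aggregate revenue per customer.
    revenue = orders.groupBy("customer_id").agg(F.sum("amount").alias("revenue"))

    # Write to BigQuery; the connector stages data through a temporary GCS bucket.
    (
        revenue.write.format("bigquery")
        .option("table", "example-project.analytics.customer_revenue")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("overwrite")
        .save()
    )

    spark.stop()


if __name__ == "__main__":
    main()
```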

Posted 2 months ago

AI Match Score
Apply

8.0 - 10.0 years

22 - 25 Lacs

Chennai

Work from Office

Must have experience in Python, Dataflow, and Dataproc. Must have experience in GCP and Dataform. Experience with Agile software development. Must have experience in BigQuery, Terraform, and Data Fusion. Good to have experience in Cloud SQL, GCP, and Kafka. Contact Person: Kathiravan G. Email ID: kathiravan@gojobs.biz

Posted 2 months ago

AI Match Score
Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for developing and maintaining the Enterprise Data Platform. Your main focus will be on designing, constructing, and enhancing scalable data pipelines within the Google Cloud Platform (GCP) ecosystem. By leveraging GCP Native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role provides you with the opportunity to apply your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client. Key Responsibilities: - Design, build, and optimize ...

Posted 2 months ago

AI Match Score
Apply
Page 1 of 4

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
