869 Data Ingestion Jobs - Page 23

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At Guidewire, we take pride in supporting our customers' mission to safeguard the world's most valuable investments. Insurance plays a crucial role in protecting our homes, businesses, and other assets, providing aid in times of need caused by natural disasters or accidents. Our goal is to provide a platform that enables Property and Casualty (P&C) insurers to offer the necessary products and services for individuals to recover from life's most challenging events. We are seeking a product management professional to join our Analytics and Data Services (ADS) team at Guidewire. The ADS team is dedicated to defining and designing new capabilities for the insurance market through our cutting-edg...

Posted 3 months ago

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have 6-10 years of hands-on experience in Java development, focusing on building robust data processing components. Your proficiency should include working with Google Cloud Pub/Sub or similar streaming platforms like Kafka. You must be skilled in JSON schema design, data serialization, and handling structured data formats. As an experienced individual, you should be capable of designing BigQuery views optimized for performance, scalability, and ease of consumption. Your responsibilities will include enhancing and maintaining Java-based adapters to publish transactional data from the Optimus system to Google Pub/Sub. Implementing and managing JSON schemas for smooth and accurate d...
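
The listing is Java-centric, but the publish pattern it describes is easy to illustrate. Below is a minimal Python sketch of pushing a JSON-serialized transactional record to a Pub/Sub topic with the google-cloud-pubsub client; the project ID, topic name, and record fields are placeholders, not details from the posting.

```python
# Minimal sketch: publish one JSON record to a Google Cloud Pub/Sub topic.
# Project, topic, and payload fields are illustrative placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "optimus-transactions")  # placeholder names

record = {"order_id": "12345", "amount": 250.0, "currency": "INR"}       # illustrative payload

# Pub/Sub payloads are bytes; attributes (e.g. source) must be strings.
future = publisher.publish(topic_path, json.dumps(record).encode("utf-8"), source="optimus")
print("Published message ID:", future.result())
```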

Posted 3 months ago

1.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Salesforce (SF) Data Cloud Developer with 3-6 years of experience, you will be responsible for designing and implementing first- and third-party data ingestion from various sources such as web, CRM, mobile apps, and media platforms. You should have a strong understanding of Data Streams, Data Model Objects (DMOs), Calculated Insights, and Segmentation within Salesforce Data Cloud. A pharma background or familiarity with privacy frameworks like HIPAA is preferred, and holding the Salesforce Data Cloud Consultant certification would be advantageous. Your role will involve designing and implementing Calculated Insights to derive metrics for personalization and segmentation. You should have an understa...

Posted 3 months ago

10.0 - 15.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are an experienced OCI AI Architect who will lead the design and deployment of Gen AI, agentic AI, and traditional AI/ML solutions on Oracle Cloud. Your role requires a deep understanding of Oracle Cloud architecture; Gen AI, agentic, and AI/ML frameworks; data engineering; and OCI-native services. The ideal candidate combines deep technical expertise in AI/ML and Gen AI on OCI with domain knowledge in Finance and Accounting. Your key responsibilities will include designing, architecting, and deploying AI/ML and Gen AI solutions on OCI using native AI services, building agentic AI solutions using frameworks such as LangGraph, CrewAI, ...
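
For context on the agentic-AI side of this role, here is a minimal, hedged sketch of a single-node agent graph built with LangGraph (one of the frameworks the posting names). The state fields and node name are illustrative, and the model call is stubbed out rather than wired to an actual OCI Generative AI endpoint.

```python
# Minimal LangGraph sketch: one node that would normally call a hosted LLM.
# State fields and node names are illustrative; the LLM call is a stub.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    question: str
    answer: str

def answer_node(state: AgentState) -> dict:
    # In a real solution this step would call an OCI Generative AI model; here it is stubbed.
    return {"answer": f"(stub) response to: {state['question']}"}

graph = StateGraph(AgentState)
graph.add_node("answer", answer_node)
graph.set_entry_point("answer")
graph.add_edge("answer", END)
app = graph.compile()

print(app.invoke({"question": "Summarise last quarter's accruals", "answer": ""}))
```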

Posted 3 months ago

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer. Overall experience: 5+ years in Snowflake and Python, plus 5+ years in data preparation and BI projects, understanding business requirements in a BI context and the underlying data model in order to transform raw data into meaningful data using Snowflake and Python. Responsibilities include designing and creating data models that define the structure and relationships of the various data elements within the organization, covering conceptual, logical, and physical data models that help ensure data accuracy, consistency, and integrity, and designing data integration solutions that allow different systems and applications to share and exchange data seamlessly. This may...
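
To make the Snowflake-plus-Python requirement concrete, here is a minimal sketch of pulling raw data from Snowflake into pandas and shaping it for BI consumption. All connection parameters, table names, and columns are placeholders rather than details from the posting.

```python
# Minimal sketch: query Snowflake, load into pandas, and produce a BI-ready monthly summary.
# Account, credentials, and table/column names are placeholders.
import pandas as pd
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
    database="RAW",            # placeholder
    schema="SALES",            # placeholder
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_id, order_date, amount FROM raw_orders")  # hypothetical table
    df = pd.DataFrame(cur.fetchall(), columns=[c[0] for c in cur.description])
    # Simple transformation: monthly revenue, the kind of model-ready output a BI layer consumes.
    df["ORDER_DATE"] = pd.to_datetime(df["ORDER_DATE"])
    monthly = df.groupby(df["ORDER_DATE"].dt.to_period("M"))["AMOUNT"].sum()
    print(monthly)
finally:
    conn.close()
```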

Posted 3 months ago

1.0 - 2.0 years

3 - 4 Lacs

Gurugram, Bengaluru

Work from Office

About the Role: Grade Level (for internal use): 08. S&P Global Mobility. The Role: Data Engineer. The Team: We are the Research and Modeling team, driving innovation by building robust models and tools to support the Vehicle & Powertrain Forecast team. Our work includes all aspects of development of, and ongoing support for, our business line data flows, analyst modelling solutions and forecasts, new apps, new client-facing products, and many other areas. We value ownership, adaptability, and a passion for learning, while fostering an environment where diverse perspectives and mentorship fuel continuous growth. The Impact: We are seeking a motivated and talented Data Engineer to be a k...

Posted 3 months ago

3.0 - 8.0 years

5 - 10 Lacs

Ahmedabad

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure...
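
As a rough illustration of the Databricks portion of this stack, the sketch below reads raw files (for example, files landed by an ADF Copy activity) from ADLS Gen2 and writes a curated Delta table. The storage path, table name, and column are placeholders; in a Databricks notebook the `spark` session is already available.

```python
# Minimal PySpark sketch for Azure Databricks: raw ADLS files -> curated Delta table.
# Paths, table names, and columns are placeholders; `spark` is provided by the notebook runtime.
raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/sales/2024/"  # placeholder container/path
curated_table = "curated.sales_orders"                                # placeholder table name

df = (spark.read
      .option("header", "true")
      .csv(raw_path))

# Light cleanup before publishing to the curated zone.
clean = df.dropDuplicates().na.drop(subset=["order_id"])              # assumes an order_id column

(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable(curated_table))
```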

Posted 3 months ago

6.0 - 9.0 years

9 - 13 Lacs

Gurugram

Work from Office

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge in data warehousing, ETL/ELT processes...

Posted 3 months ago

4.0 - 9.0 years

6 - 10 Lacs

Vadodara

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.
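
To illustrate the ingestion edge of such a platform, here is a minimal sketch that publishes JSON events to Azure Event Hubs with the azure-eventhub SDK. The connection string, hub name, and payload fields are placeholders, not details from the posting.

```python
# Minimal sketch: send a small batch of JSON events to Azure Event Hubs.
# Connection string, hub name, and event fields are illustrative placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://...",   # placeholder connection string
    eventhub_name="telemetry",      # placeholder hub name
)

events = [{"device_id": i, "reading": 20.5 + i} for i in range(3)]  # illustrative payload

with producer:
    batch = producer.create_batch()
    for event in events:
        batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
```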

Posted 3 months ago

3.0 - 8.0 years

3 - 7 Lacs

Chennai

Work from Office

- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure Synapse Analytics -...

Posted 3 months ago

3.0 - 8.0 years

3 - 7 Lacs

Patna

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure...

Posted 3 months ago

4.0 - 9.0 years

6 - 12 Lacs

Kanpur

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 3 months ago

4.0 - 9.0 years

6 - 12 Lacs

Ludhiana

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 3 months ago

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure...

Posted 3 months ago

4.0 - 9.0 years

6 - 10 Lacs

Patna

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 3 months ago

4.0 - 9.0 years

6 - 10 Lacs

Surat

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 3 months ago

3.0 - 8.0 years

3 - 7 Lacs

Nashik

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure...

Posted 3 months ago

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We help organizations enhance their operations, reduce costs, and boost efficiency by managing key processes and functions on their behalf. Our expertise lies in project management, technology, and process optimization, allowing us to deliver high-quality services to our clients. In managed service management and strategy at PwC, the focus is on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and...

Posted 3 months ago

3.0 - 8.0 years

3 - 7 Lacs

Lucknow

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, Integration Services runtime
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with big data components such as Kafka, Spark SQL, DataFrames, Hive, etc., implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a data warehouse with Azure...

Posted 3 months ago

4.0 - 9.0 years

6 - 10 Lacs

Kolkata

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hubs/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions.
- Azure Synapse or Azure SQL data warehouse.
- Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 3 months ago

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Job Area: Engineering Group, Engineering Group > Mechanical Engineering. General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Mechanical Engineer, you will design, analyze, troubleshoot, and test electro-mechanical systems and packaging. Qualcomm Engineers collaborate across functions to provide design information and complete project deliverables. Minimum Qualifications: Bachelor's degree in Mechanical Engineering or a related field. Location: Bangalore, India. Job Overview: The successful candidate will operate ...

Posted 3 months ago

2.0 - 4.0 years

7 - 11 Lacs

Hyderabad

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiarity with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).
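
As a concrete example of the kind of pipeline described, here is a minimal Airflow (2.x) DAG sketch that extracts records from an API with requests, transforms them with pandas, and writes a Parquet file. The endpoint URL, output path, DAG name, and schedule are illustrative placeholders.

```python
# Minimal Airflow DAG sketch: API extract -> pandas transform -> Parquet load.
# Endpoint, output path, DAG name, and schedule are placeholders.
from datetime import datetime

import pandas as pd
import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transform_load():
    resp = requests.get("https://example.com/api/orders")  # placeholder API endpoint
    resp.raise_for_status()
    df = pd.DataFrame(resp.json())
    df["loaded_at"] = datetime.utcnow().isoformat()         # simple audit column
    df.to_parquet("/tmp/orders.parquet")                     # placeholder target

with DAG(
    dag_id="orders_etl",              # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_transform_load",
                   python_callable=extract_transform_load)
```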

Posted 3 months ago

2.0 - 4.0 years

7 - 11 Lacs

Mumbai

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiarity with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 3 months ago

2.0 - 4.0 years

7 - 11 Lacs

Kolkata

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiarity with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 3 months ago

2.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiarity with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 3 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies