869 Data Ingestion Jobs - Page 27

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Minimum 6 years of hands-on experience in data engineering or big data development roles. Strong programming skills in Python and experience with Apache Spark (PySpark preferred). Proficient in writing and optimizing complex SQL queries. Hands-on experience with Apache Airflow for orchestration of data workflows. Deep understanding and practical experience with AWS services: Data Storage & Processing (S3, Glue, EMR, Athena); Compute & Execution (Lambda, Step Functions); Databases (RDS, DynamoDB); Monitoring (CloudWatch). Experience with distributed data processing, parallel computing, and performance tuning. Strong analytical and problem-solving skills. Familiarity with CI/CD pipelines and DevOps practices ...
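For illustration only, here is a minimal Apache Airflow sketch of the kind of AWS orchestration this listing describes. It assumes a recent Airflow 2.x installation and boto3 with configured AWS credentials; the Glue job name, Athena database, and S3 bucket are hypothetical placeholders, not details from the posting.

```python
# Minimal Airflow DAG sketch: trigger a Glue job, then run an Athena query.
# The job name, database, and bucket are hypothetical placeholders.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_glue_job():
    """Kick off a (hypothetical) Glue ETL job and return its run id."""
    glue = boto3.client("glue")
    run = glue.start_job_run(JobName="ingest-clickstream-job")
    return run["JobRunId"]


def run_athena_query():
    """Run a simple Athena query against the curated table."""
    athena = boto3.client("athena")
    athena.start_query_execution(
        QueryString="SELECT count(*) FROM analytics.clickstream_curated",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )


with DAG(
    dag_id="daily_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    glue_task = PythonOperator(task_id="start_glue_job", python_callable=start_glue_job)
    athena_task = PythonOperator(task_id="run_athena_query", python_callable=run_athena_query)
    glue_task >> athena_task
```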

Posted 4 months ago


8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

This role will be instrumental in building and maintaining robust, scalable, and reliable data pipelines using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. The ideal candidate will have a strong understanding of data streaming concepts, experience with real-time data processing, and a passion for building high-performance data solutions. This role requires excellent analytical skills, attention to detail, and the ability to work collaboratively in a fast-paced environment. Essential Responsibilities: Design & develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. Build and configure Kafka Connec...
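As a loose illustration of the streaming work described above, a minimal Python sketch using the confluent-kafka client; the broker address, topic name, and consumer group are hypothetical, and the ksqlDB, Kafka Connect, and Flink pieces of such a pipeline are not shown.

```python
# Minimal Confluent Kafka produce/consume sketch (hypothetical broker and topic).
import json

from confluent_kafka import Consumer, Producer

# Publish one event to a raw topic.
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("orders.raw", value=json.dumps({"order_id": 1, "amount": 42.5}).encode("utf-8"))
producer.flush()

# Read it back as a downstream processor would.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders.raw"])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))
consumer.close()
```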

Posted 4 months ago


5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachel...
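Purely as a sketch of the source-to-target pipeline work mentioned above, a small PySpark example; the paths, column names, and bucket names are hypothetical.

```python
# Minimal PySpark source-to-target sketch: read raw CSV, apply a simple
# transformation, and write partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("source_to_target_example").getOrCreate()

raw = spark.read.option("header", "true").csv("s3a://example-raw/orders/")

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

curated.write.mode("overwrite").partitionBy("order_date").parquet("s3a://example-curated/orders/")
spark.stop()
```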

Posted 4 months ago


5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelo...

Posted 4 months ago


10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Key Responsibilities: Instrument Angular frontend and Java backend applications with GIL for effective logging and analytics. Design and implement client-side and server-side tracking mechanisms to capture key user and system activities. Aggregate, store, and process instrumentation data efficiently for reporting and analytics. Develop dashboards to summarize usage metrics, engagement patterns, and system health using modern visualization frameworks/tools. Create tracking for usage of different data sources (e.g., APIs, databases) and present metrics to business and technical stakeholders. Collaborate closely with product managers, UX designers, backend engineers, and data engineers to ident...

Posted 4 months ago


8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

2+ years of implementation experience with Adobe Experience Cloud products, especially Adobe Experience Platform and Journey Optimizer. Expertise in deploying, configuring and optimizing all major Experience Platform services and Journey Optimizer features. Strong SQL skills for querying datasets, implementing data transformations, cleansing data, etc. Hands-on experience developing custom applications, workflows and integrations using Experience Platform APIs. Deep familiarity with Adobe Experience Platform and Journey Optimizer technical implementations including: Setting up source connectors and ingesting data using the API and UI; Configuring Experience Events (XDM schemas) for capturing data ...

Posted 4 months ago


0.0 - 1.0 years

2 - 3 Lacs

Noida

Work from Office

As an intern, you will play a key role in supporting our data operations by handling Level 1 (L1) alert monitoring for both ingestion and analytics pipelines. You'll be responsible for performing L1 troubleshooting on assigned ingestion and analytics tasks as part of our Business-As-Usual (BAU) activities. This role also involves cross-collaboration with multiple teams to ensure timely resolution of issues and maintain the smooth functioning of data workflows. It's a great opportunity to gain hands-on experience in real-time monitoring, issue triaging, and inter-team coordination in a production environment. A Day in the life Create world-class customer-facing documentation which would deli...

Posted 4 months ago


5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Design, implement, and optimize Big Data solutions using Hadoop technologies. You will work on data ingestion, processing, and storage, ensuring efficient data pipelines. Strong expertise in Hadoop, HDFS, and MapReduce is essential for this role.
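To give a concrete flavor of the Hadoop/MapReduce work this role references, here is the classic Hadoop Streaming word-count pair in Python; this is a generic textbook example, not something taken from the posting.

```python
# mapper.py - Hadoop Streaming mapper: emit (word, 1) for each word on stdin.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print(f"{word}\t1")
```

```python
# reducer.py - Hadoop Streaming reducer: sum counts for each word.
# Hadoop Streaming delivers mapper output sorted by key, so identical words
# arrive contiguously.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t")
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print(f"{current_word}\t{current_count}")
        current_word, current_count = word, int(count)
if current_word is not None:
    print(f"{current_word}\t{current_count}")
```

Such a pair would typically be submitted with the Hadoop Streaming jar, e.g. hadoop jar hadoop-streaming-*.jar -files mapper.py,reducer.py -mapper "python3 mapper.py" -reducer "python3 reducer.py" -input /data/in -output /data/out (HDFS paths hypothetical).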

Posted 4 months ago


0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Python + Cloud)! In this role, the Snowflake Data Engineer is resp...

Posted 4 months ago


5.0 - 10.0 years

3 - 7 Lacs

Hyderabad

Work from Office

5+ years of experience in developing Snowflake data models, data ingestion, views, stored procedures, and complex queries. Good experience in SQL. Experience in Informatica PowerCenter / IICS ETL tools. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Provide production support for Data Warehouse issues such as data load problems and transformation/translation problems. Ability to facilitate and coordinate discussion and to manage expectations of multiple stakeholders. Candidate must have good communication and facilitation skills. Work in an onsite-offshore model involving daily interactions with Onshore teams to ensure on-...
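For flavor only, a minimal sketch of Snowflake ingestion from Python using the snowflake-connector-python package; the account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal Snowflake ingestion sketch: copy staged files into a raw table and
# refresh a reporting view. Connection parameters and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Load any new files from an external stage into the raw table.
    cur.execute(
        "COPY INTO RAW.ORDERS FROM @RAW.ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Recreate a simple curated view on top of the raw data.
    cur.execute(
        "CREATE OR REPLACE VIEW CURATED.ORDERS_DAILY AS "
        "SELECT ORDER_DATE, SUM(AMOUNT) AS TOTAL_AMOUNT FROM RAW.ORDERS GROUP BY ORDER_DATE"
    )
finally:
    conn.close()
```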

Posted 4 months ago


10.0 - 15.0 years

10 - 15 Lacs

Pune

Work from Office

Provides technical expertise, including addressing and resolving complex technical issues. Demonstrable experience assessing application workloads and technology landscape for Cloud suitability, and developing the business case and Cloud adoption roadmap. Expertise in data ingestion, data loading, Data Lake, bulk processing, and transformation using Azure services, and migrating on-premises services to various Azure environments. Good experience of a range of services from the Microsoft Azure Cloud Platform, including Infrastructure and Security related services such as Azure AD, IaaS, Containers, Storage, Networking and Azure Security. Good experience of enterprise solution shaping and Microsoft Azure Cloud architectur...
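As a small illustrative sketch of the Azure data-loading work listed above (not taken from the posting), uploading a local file into an ADLS Gen2 container with the azure-storage-file-datalake SDK; the account URL, container, and target path are hypothetical.

```python
# Minimal ADLS Gen2 upload sketch. Account URL, container, and path are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
filesystem = service.get_file_system_client(file_system="raw")
file_client = filesystem.get_file_client("sales/2024/01/orders.csv")

with open("orders.csv", "rb") as source:
    # Overwrite any existing file at the target path with the local file's contents.
    file_client.upload_data(source, overwrite=True)
```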

Posted 4 months ago


5.0 - 10.0 years

5 - 8 Lacs

Chennai

Work from Office

We are looking immediately for Snowflake Data Warehouse Engineers (Contract, Chennai). Role: Snowflake Data Warehouse Engineer. Experience: 5+ years. Location: Chennai. Period: Immediate. Type: Contract. Description: We need experienced, collaborative Snowflake Data Warehouse Engineers with 5+ years of experience in developing Snowflake data models, data ingestion, views, stored procedures, and complex queries. Good experience in SQL. Experience in Informatica PowerCenter / IICS ETL tools. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Provide production support for Data Warehouse issues such as data load problems and transformation/translation problems. Ability to fa...

Posted 4 months ago


5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a Data Engineer, you will ensure the smooth functioning of our applications and data systems. Your expertise in Data Ingestion, Release Management, Monitoring, Incident Review, Databricks, Azure Cloud, and Data Analysis will be instrumental in maintaining the reliability, availability, and performance of our applications and data pipelines. You will collaborate closely with cross-functional teams to support application deployments, monitor system health, analyze data, and provide timely resolutions to incidents. The ideal candidate should have a strong background in Azure DevOps, Azure Cloud (especially ADF), Databricks, and AWS Cloud. List of Key Responsibilities: Implement and manage data ingestion...
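As a rough sketch of the Databricks-style ingestion work named above, assuming a Spark runtime with Delta Lake support (as on Databricks); the storage path and table names are hypothetical.

```python
# Minimal Databricks-style ingestion sketch: read JSON landed by an upstream
# process and append it to a Delta table. Assumes Delta support and that the
# "ops" schema already exists; paths and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("app_events_ingest").getOrCreate()

events = (
    spark.read.json("abfss://landing@exampleaccount.dfs.core.windows.net/app-events/")
         .withColumn("ingested_at", F.current_timestamp())
)

events.write.format("delta").mode("append").saveAsTable("ops.app_events")
```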

Posted 4 months ago


6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Hybrid

We are seeking a skilled Database Specialist with strong expertise in Time-Series Databases, specifically Loki for logs, InfluxDB, and Splunk for metrics. The ideal candidate will have a solid background in query languages, Grafana, Alert Manager, and Prometheus. This role involves managing and optimizing time-series databases, ensuring efficient data storage, retrieval, and visualization. Key Responsibilities: Design, implement, and maintain time-series databases using Loki, InfluxDB, and Splunk to store and manage high-velocity time-series data. Develop efficient data ingestion pipelines for time-series data from various sources (e.g., IoT devices, application logs, metrics). Optimize data...
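For illustration of the time-series ingestion and query work described above, a minimal InfluxDB 2.x sketch using the influxdb-client package; the URL, token, org, and bucket are hypothetical, and the Loki/Splunk/Grafana pieces are not shown.

```python
# Minimal InfluxDB 2.x write/query sketch. Connection details and bucket are hypothetical.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="example-token", org="example-org")

# Write one metric point for a (hypothetical) host.
write_api = client.write_api(write_options=SYNCHRONOUS)
write_api.write(
    bucket="metrics",
    record=Point("cpu").tag("host", "server01").field("usage_percent", 12.5),
)

# Query the last hour of CPU points with Flux.
tables = client.query_api().query(
    'from(bucket: "metrics") |> range(start: -1h) |> filter(fn: (r) => r._measurement == "cpu")'
)
for table in tables:
    for record in table.records:
        print(record.get_time(), record.get_value())

client.close()
```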

Posted 4 months ago


4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Title: Software Engineer GCP Data Engineering Work Mode: Remote Base Location: Bengaluru Experience Required: 4 to 6 Years Job Summary: We are seeking a Software Engineer with a strong background in GCP Data Engineering and a solid understanding of how to build scalable data processing frameworks. The ideal candidate will be proficient in data ingestion, transformation, and orchestration using modern cloud-native tools and technologies. This role requires hands-on experience in designing and optimizing ETL pipelines, managing big data workloads, and supporting data quality initiatives. Key Responsibilities: Design and develop scalable data processing solutions using Apache Beam, Spark, a...
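As a loose sketch of the Beam-based processing this posting mentions, a tiny pipeline runnable locally with the default DirectRunner (on GCP it would typically target Dataflow); the file paths are hypothetical.

```python
# Minimal Apache Beam pipeline sketch: count events per user from a CSV file.
# File paths are hypothetical; the DirectRunner is used implicitly.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("events.csv", skip_header_lines=1)
        | "ExtractUser" >> beam.Map(lambda line: (line.split(",")[0], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
        | "WriteCounts" >> beam.io.WriteToText("user_counts")
    )
```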

Posted 4 months ago


5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters. Big Data Developer - Spark, Scala, PySpark. Coding & scripting. Years of Experience: 5 to 12 years. Location: Bangalore. Notice Period: 0 to 30 days. Key Skills: - Proficient in Spark, Scala, PySpark coding & scripting - Fluent in big data engineering development using the Hadoop/Spark ecosystem - Hands-on experience in Big Data - Good knowledge of the Hadoop ecosystem - Knowledge of cloud architecture AWS - Data ingestion and integration into the Data Lake u...

Posted 4 months ago


4.0 - 9.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Roles and Responsibilities Develop product strategy, roadmap, and backlog to drive business growth. Collaborate with cross-functional teams to deliver high-quality products that meet customer needs. Analyze market trends, competitors, and customer feedback to inform product decisions. Ensure effective communication with stakeholders through regular updates on product progress. Desired Candidate Profile 4-9 years of experience in Product Management or related field (Analytics). Strong understanding of Agile methodology, Scrum framework, and SDLC life cycle. Proficiency in tools such as JIRA, BRD, Use Cases, User Stories, Data Ingestion, SQL.

Posted 4 months ago


3.0 - 5.0 years

8 - 17 Lacs

Gurugram

Work from Office

Roles and Responsibilities : 1. Support effective clinical analysis through the development of enriched data, leveraging analytical experience to guide data selection, data visualization, and additional analysis as appropriate 2. Work with internal and external partners to develop data and requirements in support of high quality clinical analytics 3. Gather information for project-related research; analyze that information; and produce reports and analyses as appropriate 4. Work with engineering team on the development of new data, quality control of reports 5. Identify appropriate techniques for a given analysis; implement analysis through technical programming; assess results and adjust fo...

Posted 4 months ago


5.0 - 10.0 years

6 - 10 Lacs

Bengaluru, Karnataka

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.

Posted 4 months ago


6.0 - 9.0 years

8 - 11 Lacs

Chennai

Work from Office

About the job: Role: Microsoft Fabric Data Engineer. Experience: 6+ years as Azure Data Engineer, including at least 1 E2E Implementation in Microsoft Fabric. Responsibilities: - Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses. - Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions. - Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment. - Collaborate with stakeholders to translate business needs into actionable data solutions. - Troubleshoot and optimize existing Fabric implementations for enhanced performance. Skills: - Solid foundat...

Posted 4 months ago


2.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Area: Engineering Group, Engineering Group > Mechanical Engineering General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Mechanical Engineer, you will design, analyze, troubleshoot, and test electro-mechanical systems and packaging. Qualcomm Engineers collaborate across functions to provide design information and complete project deliverables. Minimum Qualifications: Bachelor's degree in Mechanical Engineering or related field and 2+ years of Mechanical Engineering or related work experience. OR Master's...

Posted 4 months ago


2.0 - 5.0 years

0 - 3 Lacs

Mumbai, Pune

Work from Office

Salesforce Data Cloud Developer JD. Key Responsibilities: Design, configure, and implement solutions within Salesforce Data Cloud to unify customer profiles across sources. Develop and maintain data ingestion pipelines, identity resolution rules, calculated insights, and activation targets. Collaborate with marketing, sales, analytics, and IT teams to define data requirements and use cases. Create and manage data streams and harmonization rules for ingesting data from Salesforce, cloud storage, and external systems. Implement and maintain segmentations and activation strategies using Data Cloud tools. Write and optimize SQL, JSON, and data transformation logic for calculated insights and uni...

Posted 4 months ago


5.0 - 10.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Google BigQuery. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: Full-time 15 years qualification. Role and Responsibilities: 1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology. 2. Strong hands-on exposure to GCP services like BigQuery, Composer etc. 3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements. 4. Developing data integration and ETL (Extract, Transform, Load)...
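As an illustrative sketch of the BigQuery pipeline work listed above, loading a CSV from Cloud Storage and running a query with the google-cloud-bigquery client; the project, dataset, table, and bucket names are hypothetical placeholders.

```python
# Minimal BigQuery sketch: load a CSV from GCS into a table, then query it.
# Project, dataset, table, and bucket names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Load a CSV file from Cloud Storage into a (hypothetical) staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/orders/2024-01-01.csv",
    "example-project.staging.orders",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # Wait for the load to finish.

# Aggregate the staged rows.
rows = client.query(
    "SELECT order_date, SUM(amount) AS total "
    "FROM `example-project.staging.orders` GROUP BY order_date"
).result()
for row in rows:
    print(row.order_date, row.total)
```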

Posted 4 months ago


6.0 - 9.0 years

9 - 13 Lacs

Kolkata

Work from Office

Experience: 6+ years as Azure Data Engineer, including at least 1 E2E Implementation in Microsoft Fabric. Responsibilities: - Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses. - Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions. - Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment. - Collaborate with stakeholders to translate business needs into actionable data solutions. - Troubleshoot and optimize existing Fabric implementations for enhanced performance. Skills: - Solid foundational knowledge in data warehousing, ETL/ELT processes...

Posted 4 months ago


10.0 - 12.0 years

25 - 30 Lacs

Bengaluru

Remote

Responsibilities : - Lead and implement end-to-end technical solutions within D365 Customer Insights (Data and Journeys) to meet diverse client requirements, from initial design to deployment and support. - Design and configure CI data unification processes, including data ingestion pipelines, matching and merging rules, and segmentation models to create comprehensive and actionable customer profiles. - Demonstrate deep expertise in data quality management and ensuring data integrity. - Proficiency in integrating data with CI-Data using various methods, including standard connectors, API calls, and custom ETL pipelines (Azure Data Factory, SSIS). - Experience with different data sources and ...

Posted 4 months ago
