875 Data Ingestion Jobs - Page 31

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

9.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop & maintain data pipelines using Airflow/Data Flow/Data Lake
- Optimize performance & scalability of ETL processes with SQL & Python

Posted 5 months ago


3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

About the Opportunity: Job Type: Application 23 June 2025. Title: Expert Engineer. Department: GPS Technology. Location: Gurugram, India. Reports To: Project Manager. Level: Grade 4. We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger. About your team: The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, leg...

Posted 5 months ago


5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

JOB DESCRIPTION We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, anal...

Posted 5 months ago


4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

What you will do: In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, data engineers, and other engineers to create high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime. You will play a key role in a regulatory submission content automation initiative which will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory ...

Posted 5 months ago


5.0 - 7.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Design, develop, and optimize large-scale data processing pipelines using PySpark. Work with various Apache tools and frameworks (like Hadoop, Hive, HDFS, etc.) to ingest, transform, and manage large datasets.
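Purely as an illustration of the ingest → transform → aggregate shape such a pipeline takes (the role uses PySpark; this stdlib sketch with made-up data only shows the pattern):

```python
from collections import defaultdict

def ingest(rows):
    """Parse raw CSV-like records into dicts (ingest step; format is invented)."""
    for line in rows:
        city, amount = line.split(",")
        yield {"city": city.strip(), "amount": float(amount)}

def transform(records):
    """Filter out invalid rows (transform step)."""
    return (r for r in records if r["amount"] > 0)

def aggregate(records):
    """Sum amounts per city (aggregate step)."""
    totals = defaultdict(float)
    for r in records:
        totals[r["city"]] += r["amount"]
    return dict(totals)

raw = ["Hyderabad, 10", "Pune, -1", "Hyderabad, 5"]
print(aggregate(transform(ingest(raw))))  # {'Hyderabad': 15.0}
```

In PySpark the same shape would be expressed with DataFrame reads, `filter`, and `groupBy().agg()` over distributed data.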

Posted 5 months ago


4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. They should also have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives. Key Responsibilities: Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, agg...

Posted 5 months ago


4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation. Key Responsibilities: Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements. Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. Implement Snowflake-based data warehouses, data lakes, and data integration solutions. Manage data ingesti...
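For illustration only, the kind of warehouse-style aggregation such a role builds can be sketched with stdlib sqlite3 standing in for Snowflake (the table and column names are invented):

```python
import sqlite3

# In Snowflake this would run against a warehouse table; sqlite3 is a stand-in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("south", 10.0), ("south", 5.0), ("north", 7.0)])

# Warehouse-style GROUP BY aggregate
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 7.0), ('south', 15.0)]
```

The same SQL is valid in Snowflake; scaling, clustering, and security layers are what the role adds on top.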

Posted 5 months ago


5.0 - 8.0 years

2 - 5 Lacs

Chennai

Work from Office

Job Information: Job Opening ID: ZR_2168_JOB. Date Opened: 10/04/2024. Industry: Technology. Job Type. Work Experience: 5-8 years. Job Title: AWS Data Engineer. City: Chennai. Province: Tamil Nadu. Country: India. Postal Code: 600002. Number of Positions: 4. Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake. Responsibilities: Create and manage cloud resources in AWS. Data ingestion from different data sources which expose data using different technologies, such as RDBMS, REST HTTP API, flat files, streams, and time-series data based on various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies. Data processing/transformation using various technologies such a...

Posted 5 months ago


5.0 - 8.0 years

2 - 6 Lacs

Mumbai

Work from Office

Job Information: Job Opening ID: ZR_1963_JOB. Date Opened: 17/05/2023. Industry: Technology. Job Type. Work Experience: 5-8 years. Job Title: Neo4j GraphDB Developer. City: Mumbai. Province: Maharashtra. Country: India. Postal Code: 400001. Number of Positions: 5. Graph Data Engineer required for a complex supply chain project. Key required skills: Graph data modelling (experience with graph data models (LPG, RDF) and the graph language Cypher; exposure to various graph data modelling techniques). Experience with Neo4j Aura and optimizing complex queries. Experience with GCP stacks like BigQuery, GCS, Dataproc. Experience in PySpark, SparkSQL is desirable. Experience in exposing graph data to visualisation tools such ...
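As an illustrative sketch only (the role centres on Neo4j and Cypher; this stdlib Python version over an invented supplier graph just shows the reachability idea):

```python
from collections import deque

# Hypothetical supplier-chain graph: node -> what it feeds into.
SUPPLY_GRAPH = {
    "SupplierA": ["PartX"],
    "PartX": ["AssemblyY"],
    "AssemblyY": ["ProductZ"],
    "SupplierB": ["AssemblyY"],
}

def downstream(graph, start):
    """Breadth-first traversal: every node reachable from `start`."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A roughly equivalent Cypher query (relationship name invented):
# MATCH (s {name: 'SupplierA'})-[:FEEDS*]->(n) RETURN DISTINCT n
print(downstream(SUPPLY_GRAPH, "SupplierA"))
```

In Neo4j the variable-length pattern `-[:FEEDS*]->` does this traversal natively, which is the point of using a graph database for supply-chain questions.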

Posted 5 months ago


3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Hybrid

Urgent requirement for Grafana. Employment: C2H. Notice Period: Immediate. We are seeking a skilled Database Specialist with strong expertise in time-series databases, specifically Loki for logs, and InfluxDB and Splunk for metrics. The ideal candidate will have a solid background in query languages, Grafana, Alertmanager, and Prometheus. This role involves managing and optimizing time-series databases, ensuring efficient data storage, retrieval, and visualization. Key Responsibilities: Design, implement, and maintain time-series databases using Loki, InfluxDB, and Splunk to store and manage high-velocity time-series data. Develop efficient data ingestion pipelines for time-series data from variou...

Posted 5 months ago


8.0 - 13.0 years

30 - 45 Lacs

Bengaluru

Hybrid

Job Title: Enterprise Data Architect | Immediate Joiner. Experience: 8-15 Years. Location: Bengaluru (Onsite/Hybrid). Joining Time: Immediate Joiners Only (0-15 Days). Job Description: We are looking for an experienced Enterprise Data Architect to join our dynamic team in Bengaluru. This is an exciting opportunity to shape modern data architecture across the finance and colleague (HR) domains using the latest technologies and design patterns. Key Responsibilities: Design and implement conceptual and logical data models for the finance and colleague domains. Define complex as-is and to-be data architectures, including transition states. Develop and maintain data standards, principles, and architecture artif...

Posted 5 months ago


5.0 - 9.0 years

0 - 3 Lacs

Hyderabad, Pune, Chennai

Work from Office

Position: Azure Data Engineer. Locations: Bangalore, Pune, Hyderabad, Chennai & Coimbatore. Key skills: Azure Databricks, Azure Data Factory, Hadoop. Relevant Exp: ADF, ADLF, Databricks - 4 yrs only; Hadoop - 3.5 or 3 yrs. Experience: 5 years. Must-have skills: Cloud certified in one of these categories • Azure Data Engineer • Azure Data Factory, Azure Databricks, Spark (PySpark or Scala), SQL, data ingestion, curation • Semantic modelling / optimization of data model to work within Rahona • Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle • Experience in Sqoop / Hadoop • Microsoft Excel (for metadata files with requirements for ingestion) • Any other certif...

Posted 5 months ago


5.0 - 9.0 years

8 - 14 Lacs

Kolkata

Work from Office

Key Responsibilities: Splunk ITSI Implementation: Develop and configure IT Service Intelligence (ITSI) modules, including KPI creation, service trees, and notable event aggregation. SIEM Development: Design, implement, and optimize Splunk SIEM solutions for threat detection, security monitoring, and log analysis. Dashboard & Visualization: Create advanced dashboards, reports, and visualizations using Splunk SPL (Search Processing Language). Data Ingestion & Parsing: Develop data onboarding, parsing, and field extractions from various log sources, including cloud and on-prem infrastructure.
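As a hypothetical illustration of the field-extraction work described (in Splunk this would be an SPL `rex` command or a props/transforms extraction; the log format here is invented):

```python
import re

# Invented log format; Splunk's SPL equivalent would be roughly:
# | rex field=_raw "user=(?<user>\w+) action=(?<action>\w+)"
LOG_PATTERN = re.compile(
    r"(?P<ts>\S+) (?P<level>[A-Z]+) user=(?P<user>\w+) action=(?P<action>\w+)"
)

def extract_fields(line):
    """Parse one log line into named fields, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

line = "2024-10-04T10:00:00Z INFO user=alice action=login"
print(extract_fields(line))
```

Once fields like `user` and `action` are extracted, they become the dimensions that ITSI KPIs, dashboards, and SIEM detections are built on.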

Posted 5 months ago


5.0 - 9.0 years

8 - 14 Lacs

Ludhiana

Work from Office

Key Responsibilities: Splunk ITSI Implementation: Develop and configure IT Service Intelligence (ITSI) modules, including KPI creation, service trees, and notable event aggregation. SIEM Development: Design, implement, and optimize Splunk SIEM solutions for threat detection, security monitoring, and log analysis. Dashboard & Visualization: Create advanced dashboards, reports, and visualizations using Splunk SPL (Search Processing Language). Data Ingestion & Parsing: Develop data onboarding, parsing, and field extractions from various log sources, including cloud and on-prem infrastructure.

Posted 5 months ago


8.0 - 10.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Summary: We are seeking a skilled and motivated Backend/Data Engineer with hands-on experience in MongoDB and Neo4j to design and implement data-driven applications. The ideal candidate will be responsible for building robust database systems, integrating complex graph- and document-based data models, and collaborating with cross-functional teams. Experience: 6-12 years. Key Responsibilities: • Design, implement, and optimize document-based databases using MongoDB. • Model and manage connected data using Neo4j (Cypher query language). • Develop RESTful APIs and data services to serve and manipulate data stored in MongoDB and Neo4j. • Implement data pipelines for data ingestion, transform...

Posted 5 months ago


4.0 - 9.0 years

16 - 27 Lacs

Hyderabad, Bengaluru

Work from Office

Preferred candidate profile: Strong knowledge and experience with the Power BI ecosystem (Power BI Desktop, Power Query, DAX, Power BI Service, etc.). Should be able to propose and design high-performing Power BI data models to cater to the client's various functional areas. Should be able to design and develop detailed Power BI reports, and use various visualizations to build summary and detailed reports. Should be very good with DAX functions and able to create complex DAX and M queries. Should be able to provide quick solutions to issues raised by business users. Should have proven capability in using various Power BI features like bookmarks, drill-through, query merge, etc. 4+ yea...

Posted 5 months ago


4.0 - 8.0 years

10 - 20 Lacs

Kolkata, Gurugram, Bengaluru

Work from Office

Job Opportunity for GCP Data Engineer. Role: Data Engineer. Location: Gurugram / Bangalore / Kolkata (5 days work from office). Experience: 4+ years. Key Skills: Data Analysis / Data Preparation - Expert; Dataset Creation / Data Visualization - Expert; Data Quality Management - Advanced; Data Engineering - Advanced; Programming / Scripting - Intermediate; Data Storytelling - Intermediate; Business Analysis / Requirements Analysis - Intermediate; Data Dashboards - Foundation; Business Intelligence Reporting - Foundation; Database Systems - Foundation; Agile Methodologies / Decision Support - Foundation. Technical Skills: • Cloud - GCP - Expert • Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert • Da...

Posted 5 months ago


4.0 - 8.0 years

10 - 20 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid

Job Title: Data Engineer - Ingestion, Storage & Streaming (Confluent Kafka) Job Summary: As a Data Engineer specializing in Ingestion, Storage, and Streaming, you will design, implement, and maintain robust, scalable, and high-performance data pipelines for the efficient flow of data through our systems. You will work with Confluent Kafka to build real-time data streaming platforms, ensuring high availability and fault tolerance. You will also ensure that data is ingested, stored, and processed efficiently and in real-time to provide immediate insights. Key Responsibilities: Kafka-Based Streaming Solutions: Design, implement, and manage scalable and fault-tolerant data streaming platforms us...
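Purely illustrative: the produce/consume-with-backpressure pattern this role describes, simulated with a stdlib queue instead of Confluent Kafka (event names invented):

```python
import queue
import threading

# Stand-in for a Kafka topic: a bounded, thread-safe queue. Real ingestion
# would use confluent-kafka producers/consumers against a broker.
topic = queue.Queue(maxsize=100)

def producer(events):
    for e in events:
        topic.put(e)           # blocks when the "topic" is full (backpressure)
    topic.put(None)            # sentinel: end of stream

def consumer(out):
    while True:
        e = topic.get()
        if e is None:
            break
        out.append(e.upper())  # per-event transformation

results = []
t1 = threading.Thread(target=producer, args=(["click", "view"],))
t2 = threading.Thread(target=consumer, args=(results,))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # ['CLICK', 'VIEW']
```

Kafka adds the parts this sketch cannot: durable partitioned logs, consumer groups, replication, and fault tolerance, which is why it is the platform of choice for real-time pipelines.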

Posted 5 months ago


5.0 - 8.0 years

9 - 14 Lacs

Bengaluru, Bangalore

Work from Office

ETL Data Engineer - Tech Lead. Bangalore, India. Information Technology. 16748. Overview: We are seeking a skilled and experienced Data Engineer to play a vital role in supporting data discovery, creating design documents, data ingestion/migration, creating data pipelines, creating data marts, and managing and monitoring data using a tech stack of Azure, SQL, Python, PySpark, Airflow and Snowflake. Responsibilities: 1. Data Discovery: Collaborate with source teams, gather complete details of data sources, and create a design diagram. 2. Data Ingestion/Migration: Collaborate with cross-functional teams to ingest/migrate data from various sources to the staging area. Develop and implement eff...

Posted 5 months ago


4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru. ETL DataStage Specialist: Join our dynamic team as an E...

Posted 5 months ago


5.0 - 9.0 years

8 - 14 Lacs

Nagpur

Work from Office

Key Responsibilities: Splunk ITSI Implementation: Develop and configure IT Service Intelligence (ITSI) modules, including KPI creation, service trees, and notable event aggregation. SIEM Development: Design, implement, and optimize Splunk SIEM solutions for threat detection, security monitoring, and log analysis. Dashboard & Visualization: Create advanced dashboards, reports, and visualizations using Splunk SPL (Search Processing Language). Data Ingestion & Parsing: Develop data onboarding, parsing, and field extractions from various log sources, including cloud and on-prem infrastructure.

Posted 5 months ago


5.0 - 9.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Key Responsibilities: Splunk ITSI Implementation: Develop and configure IT Service Intelligence (ITSI) modules, including KPI creation, service trees, and notable event aggregation. SIEM Development: Design, implement, and optimize Splunk SIEM solutions for threat detection, security monitoring, and log analysis. Dashboard & Visualization: Create advanced dashboards, reports, and visualizations using Splunk SPL (Search Processing Language). Data Ingestion & Parsing: Develop data onboarding, parsing, and field extractions from various log sources, including cloud and on-prem infrastructure.

Posted 5 months ago


5.0 - 9.0 years

8 - 14 Lacs

Lucknow

Work from Office

Key Responsibilities: Splunk ITSI Implementation: Develop and configure IT Service Intelligence (ITSI) modules, including KPI creation, service trees, and notable event aggregation. SIEM Development: Design, implement, and optimize Splunk SIEM solutions for threat detection, security monitoring, and log analysis. Dashboard & Visualization: Create advanced dashboards, reports, and visualizations using Splunk SPL (Search Processing Language). Data Ingestion & Parsing: Develop data onboarding, parsing, and field extractions from various log sources, including cloud and on-prem infrastructure.

Posted 5 months ago


3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 8 to 24 LPA. Exp: 3 to 7 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days. Job Title: Senior Data Engineer. Job Summary: We are looking for an experienced Senior Data Engineer with 5+ years of hands-on experience in cloud data engineering platforms, specifically AWS, Databricks, and Azure. The ideal candidate will play a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support our analytics and business intelligence initiatives. Key Responsibilities: Design, develop, and optimize scalable data pipelines using AWS services (e.g., S3, Glue, Redshift, Lambda). Build and maintain ETL/ELT workflows leveraging Databricks and ...

Posted 5 months ago


3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 8 to 24 LPA. Exp: 3 to 7 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days. Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units. As a Data engin...

Posted 5 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
