1166 Data Pipeline Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

0 - 0 Lacs

Bengaluru

Work from Office

SUMMARY Wissen Technology is Hiring for Python + Database Developer. About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cut...

Posted 6 days ago


7.0 - 12.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Skill: Data Engineer. Experience: 7+ Years. Location: Warangal, Bangalore, Chennai, Hyderabad, Mumbai, Pune, Delhi, Noida, Gurgaon, Kolkata, Jaipur, Jodhpur. Notice Period: Immediate - 15 Days. Job Description: Design & Build Data Pipelines: develop scalable ETL/ELT workflows to ingest, transform, and load data into Snowflake using SQL, Python, or data integration tools. Data Modeling: create and optimize Snowflake schemas, tables, views, and materialized views to support business analytics and reporting needs. Performance Optimization: tune Snowflake compute resources (warehouses), optimize query performance, and manage clustering and partitioning strategies. Data Quality & Validation. Security & A...
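The repeatable ELT loading this listing describes is commonly implemented with a Snowflake MERGE, so a re-run updates existing rows instead of duplicating them. A minimal sketch that only builds the SQL text (table and column names are hypothetical):

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Build an idempotent Snowflake MERGE: match on the key columns,
    update existing rows, insert the rest."""
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

# Hypothetical tables: load staged orders into the analytics schema.
sql = build_merge_sql("analytics.orders", "staging.orders",
                      ["order_id"], ["status", "amount"])
```

The generated statement would then be executed through whatever client the pipeline uses (e.g. the Snowflake Python connector); the string-building step is shown here because it is self-contained.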

Posted 6 days ago


10.0 - 18.0 years

0 - 2 Lacs

Noida

Remote

About the Role We are seeking a highly experienced Data Architect to design, implement, and manage large-scale data architecture solutions. The ideal candidate will have deep expertise in data modeling, cloud data platforms (Azure/AWS), and data governance, with a strong ability to architect secure, scalable, and high-performance data ecosystems. Key Responsibilities Design and develop enterprise-grade data architectures to support analytics, AI, and business intelligence initiatives. Create and maintain conceptual, logical, and physical data models aligned with organizational strategies. Architect and optimize ETL/ELT pipelines for efficient, reliable data movement and transformation. ...

Posted 6 days ago


5.0 - 8.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Seeking a Sr. Data Validation Engineer with 5+ years’ experience in SQL, Python, data validation, and ETL pipelines. Role involves building scalable data solutions on AWS and ensuring data integrity for a growing SaaS platform.
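The data-validation work this role describes usually gates loads on row-level checks before data moves downstream. A minimal framework-free sketch (column names and thresholds are illustrative):

```python
def validate_rows(rows, required, ranges):
    """Return (row_index, problem) pairs for rows failing null or
    range checks -- the kind of gate an ETL pipeline runs before
    loading data into a warehouse."""
    problems = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                problems.append((i, f"{col} is null"))
        for col, (lo, hi) in ranges.items():
            v = row.get(col)
            if v is not None and not (lo <= v <= hi):
                problems.append((i, f"{col}={v} outside [{lo}, {hi}]"))
    return problems

# Hypothetical data: second row fails both checks.
rows = [{"id": 1, "amount": 50.0}, {"id": None, "amount": -5.0}]
issues = validate_rows(rows, required=["id"], ranges={"amount": (0, 1000)})
```

In practice the same checks are often expressed in a dedicated tool (Great Expectations, dbt tests); the plain-Python form above just shows the shape of the logic.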

Posted 6 days ago


4.0 - 7.0 years

15 - 27 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 25 LPA. Exp: 4 to 7 years. Location: Gurgaon/Pune/Bengaluru. Notice: Immediate to 30 days. Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units. As a Dat...

Posted 1 week ago


3.0 - 5.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Position summary: We are seeking a Senior Software Development Engineer – Data Engineering with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. Key Responsibilities: Work with cloud-based data solutions (Azure, AWS, GCP). Implement data modeling and warehousing solutions. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL). Design and optimize data storage solutions...

Posted 1 week ago


4.0 - 7.0 years

5 - 10 Lacs

Navi Mumbai, Bengaluru

Work from Office

Role & responsibilities: Total experience of 4-8 years. Hands-on experience in Python coding is a must. Experience in data engineering which includes laudatory account. Hands-on experience in Big Data cloud platforms like AWS (Redshift, Glue, Lambda), Data Lakes, and Data Warehouses; data integration; data pipelines. Experience in SQL and writing code for the Spark engine using Python/PySpark. Experience in data pipeline and workflow management tools (such as Azkaban, Luigi, Airflow, etc.). Key Personal Attributes: Business focused, customer & service minded. Strong consultative and management skills. Good communication and interpersonal skills. Preferred candidate profile: Interested candidates, share your CV on...

Posted 1 week ago


5.0 - 10.0 years

6 - 8 Lacs

Hyderabad

Remote

Job Title: Senior-Level Data Engineer (Healthcare Domain). Location: Remote Option. Experience: 5+ Years. Employment Type: Full-Time. About the Role: We are looking for a Senior Data Engineer with extensive experience in healthcare data ecosystems and Databricks-based pipelines. The ideal candidate brings deep technical expertise in building large-scale data platforms, optimizing performance, and ensuring compliance with healthcare data standards (e.g., HIPAA, EDI, HCC). This role requires the ability to lead data initiatives, mentor junior engineers, and work cross-functionally with product, analytics, and compliance teams. Key Responsibilities: Architect, develop, and manage large-scale, secure,...

Posted 1 week ago


5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Description We are seeking a highly skilled and experienced Machine Learning Engineer to lead the design and implementation of cutting-edge ML solutions across our organization. You will be responsible for prototyping and taking models all the way from proof-of-concept to production deployment. This role requires a deep understanding of ML algorithms, data processing pipelines, model optimization, and production-grade engineering practices. Role & responsibilities Build and validate ML prototypes to solve real business problems Develop, test, and optimize ML models using structured and unstructured data Design and implement scalable data pipelines and model serving infrastructure Continu...

Posted 1 week ago


8.0 - 13.0 years

25 - 40 Lacs

Bengaluru, Thiruvananthapuram, Mumbai (All Areas)

Hybrid

Role: Solutions Architect - AWS Data and Analytics GTM. Experience: 8+ years, minimum 5 years on AWS. Work location: Mumbai/Bangalore/Trivandrum (Hybrid). Responsibilities: Be a trusted technical advisor to customers, providing solutions for complex Cloud & Data technical challenges. Be a thought leader in the architecture design and development of cloud data analytics solutions. Liaise with internal and external stakeholders to design optimized data analytics solutions on AWS cloud. Partner with SMEs and Solutions Architects from leading cloud providers to present solutions to customers. Support Quantiphi Sales and GTM teams from a technical perspective in building proposals and SOWs. Lead discover...

Posted 1 week ago


10.0 - 15.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Role - GCP Data Engineer. Experience: 6~13 years. Preferred: Immediate to 30 days joiners. Location - Bangalore, Chennai, Pune, Kolkata, Hyderabad. Job Requirement: Have implemented and architected solutions on Google Cloud Platform using GCP components. Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases. Certified in Google Professional Data Eng...

Posted 1 week ago


4.0 - 5.0 years

4 - 8 Lacs

Mangaluru

Hybrid

Job Description: We are seeking a skilled Cloud Engineer to join our team. The ideal candidate will provide Level 2 (L2) and Level 3 (L3) operational support for our cloud infrastructure. You will be responsible for monitoring data pipelines to ensure optimal performance and reliability. Additionally, you will work on enhancing automation processes through AIOps, implementing solutions that leverage artificial intelligence for IT operations. You will collaborate with cross-functional teams to troubleshoot and resolve issues, improve system performance, and deploy new solutions in the cloud environment. Responsibilities: - Provide L2 and L3 operational support for cloud services and infrastru...

Posted 1 week ago


5.0 - 7.0 years

14 - 18 Lacs

Hyderabad

Hybrid


Posted 1 week ago


8.0 - 10.0 years

25 - 30 Lacs

Bangalore Rural, Bengaluru

Work from Office

Location: Mumbai. Job Description: Looking for candidates with 8 to 10 years of experience. Hands-on experience implementing data pipelines using traditional DWH, Big Data & Cloud ecosystems. Good exposure to data architecture design and cost and size estimation. Good understanding of handling realtime/streaming pipelines. Experience with Data Quality and Data Governance. Experience handling and interacting with clients and managing vendors. Knowledge of AI/ML and GenAI is a plus. Exposure to managing and leading teams.

Posted 1 week ago


8.0 - 13.0 years

0 Lacs

Ahmedabad

Work from Office

We are seeking an AI/ML Engineer to join our software project team. The ideal candidate will be responsible for designing, developing, and integrating artificial intelligence models and algorithms into applications. This role requires collaboration with data scientists and developers to implement machine learning, natural language processing, or computer vision solutions tailored to project requirements. Key Responsibilities: Design and develop AI models and algorithms for integration into applications. Collaborate with data scientists and developers to implement machine learning, natural language processing, and computer vision solutions. Ensure AI models are trained, tested, and optimized ...

Posted 1 week ago


4.0 - 7.0 years

11 - 21 Lacs

Hyderabad, Chennai

Hybrid

Job role: Data Engineer. Location: Hyderabad/Chennai (Hybrid). Employment Type: Full-time. Team: Data Engineering. No. of Positions: 4. JD: Hands-on expertise in the GCP data stack: BigQuery, Dataflow (Apache Beam), Dataproc, Cloud Storage, Pub/Sub, Cloud Composer (Airflow). Strong Spark (PySpark or Scala) for batch processing on Dataproc. Solid Airflow DAG design (idempotent tasks, backfills, retries, SLAs). Advanced SQL and data modeling (star/snowflake, slowly changing dimensions, partition strategies). Proficiency in Python (preferred) or Scala/Java for data engineering. Experience with Git and CI/CD (Cloud Build/GitHub Actions/GitLab CI). Familiarity with security & governance on GCP (IAM, servi...
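The Airflow DAG qualities this listing calls out (idempotent tasks, safe backfills, retries) mostly come down to tasks whose output is keyed by the logical date and written atomically, so re-running a task replaces its output rather than appending to it. A minimal framework-free sketch (the file layout and function name are hypothetical, not an Airflow API):

```python
import json
import os
import tempfile

def write_partition(base_dir, ds, records):
    """Idempotent 'task': output is keyed by the logical date (ds)
    and written atomically, so a retry or backfill fully overwrites
    the partition -- running twice equals running once."""
    path = os.path.join(base_dir, f"dt={ds}.json")
    fd, tmp = tempfile.mkstemp(dir=base_dir)
    with os.fdopen(fd, "w") as f:
        json.dump(records, f)
    os.replace(tmp, path)  # atomic rename: readers never see a partial file
    return path

out_dir = tempfile.mkdtemp()
write_partition(out_dir, "2024-01-01", [{"clicks": 3}])
p = write_partition(out_dir, "2024-01-01", [{"clicks": 5}])  # simulated retry
```

Inside an actual Airflow DAG, `ds` would come from the task's templated context, and `retries`/`sla` would be set in `default_args`; the overwrite-by-logical-date pattern above is what makes those retries safe.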

Posted 1 week ago


9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

6 years of relevant experience in Data Engineering & Data Architecture. Good understanding of Data Architecture. Experience in AWS is mandatory. Experience in Python & PySpark. Required Candidate profile: Experience in ETL development and ETL pipelines. Experience in Data Modelling, Data Pipelines & Data Lakes. Experience in SQL. Experience working as an Individual Contributor and Lead.

Posted 1 week ago


6.0 - 11.0 years

0 - 0 Lacs

Bengaluru

Work from Office

SUMMARY Wissen Technology is Hiring for Data Engineer. About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge tech...

Posted 1 week ago


3.0 - 7.0 years

14 - 18 Lacs

Pune

Hybrid

Job Title: GCP Data Engineer. Location: Pune, India. Experience: 3 to 7 Years. Job Type: Full-Time. Job Summary: We are looking for a highly skilled GCP Data Engineer with 3 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives. Key Responsibilities: Design and implement scalable and efficient data pipelines on GCP, leveraging D...

Posted 1 week ago


5.0 - 7.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, mainten...
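The cost-calculation skill this role mentions is, for a warehouse like Snowflake, simple arithmetic over credits: warehouses bill credits per hour by size, with per-second billing after a 60-second minimum. A sketch under those assumptions (the per-size rates follow Snowflake's published doubling pattern, but treat both the rates and the $3/credit price as illustrative, not authoritative):

```python
def warehouse_cost(seconds, size="XS", usd_per_credit=3.0):
    """Estimate compute cost for one warehouse run: credits/hour by
    size (illustrative doubling table), per-second billing with a
    60-second minimum, times an assumed dollar price per credit."""
    credits_per_hour = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}
    billed = max(seconds, 60)           # minimum billing window
    credits = credits_per_hour[size] * billed / 3600
    return round(credits * usd_per_credit, 4)

cost = warehouse_cost(1800, size="M")   # 30 minutes on a Medium warehouse
```

Estimates like this are useful for comparing, say, one Medium warehouse against two Small ones before tuning queries; always confirm the actual rates on the vendor's pricing page.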

Posted 1 week ago


7.0 - 9.0 years

5 - 5 Lacs

Bengaluru

Work from Office

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively ...

Posted 1 week ago


3.0 - 5.0 years

5 - 5 Lacs

Bengaluru

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding skills in Python, PySpark, and SQL. Works independently and demonstrates proficiency in at least one domain related to data, with a solid understanding of SCD concepts and data warehousing principles. Outcomes: Collaborate closely with data analysts, data scientists, and other stakeholders to ensure data accessibility, quality, and security across various data sources. Design, develop, and maintain data pipelines...
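The "SCD concepts" this role asks for are easiest to show with a Type 2 slowly changing dimension: when a tracked attribute changes, the current row is closed out and a new current row is inserted, preserving history. A minimal pure-Python sketch (column names such as `cust_id` and `tier` are illustrative; real pipelines would do this as a MERGE in the warehouse):

```python
def scd2_apply(dim, updates, key, tracked, ds):
    """Minimal SCD Type 2 merge: close the current row when a tracked
    attribute changes, then insert a new current row effective at ds."""
    current = {r[key]: r for r in dim if r["is_current"]}
    for u in updates:
        old = current.get(u[key])
        if old is None:
            # brand-new entity: insert as current
            dim.append({**u, "valid_from": ds, "valid_to": None, "is_current": True})
        elif any(old[c] != u[c] for c in tracked):
            # tracked attribute changed: expire old row, insert new version
            old["valid_to"] = ds
            old["is_current"] = False
            dim.append({**u, "valid_from": ds, "valid_to": None, "is_current": True})
    return dim

dim = [{"cust_id": 1, "tier": "silver", "valid_from": "2023-01-01",
        "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, [{"cust_id": 1, "tier": "gold"}],
                 key="cust_id", tracked=["tier"], ds="2024-06-01")
```

After the merge, the dimension holds both the expired "silver" row and the new current "gold" row, which is exactly the history-preserving behavior that distinguishes Type 2 from a Type 1 overwrite.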

Posted 1 week ago


7.0 - 9.0 years

5 - 5 Lacs

Bengaluru

Work from Office

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively ...

Posted 1 week ago


5.0 - 7.0 years

5 - 5 Lacs

Bengaluru

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, mainten...

Posted 1 week ago


5.0 - 10.0 years

0 Lacs

Bhubaneswar, Pune, Delhi/NCR

Hybrid

Job description: Hiring for PySpark Specialist | Data Pipeline. Mandatory Skills: PySpark, Python, SQL. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be-processes an...

Posted 1 week ago
