10328 Data Engineering Jobs - Page 49

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

4 - 8 Lacs

Gurugram

Work from Office

Job Requirements:
- 3-6 years of experience running medium- to large-scale production environments
- Proven programming/scripting skills in at least one language (e.g., Python, Java, Scala, JavaScript)
- Experience with at least one cloud service and infrastructure provider (AWS, GCP, Azure)
- Proficiency in writing analytical SQL queries
- Experience building analytical tools that use data pipelines to provide key actionable insights
- Knowledge of big-data tools such as Hadoop, Kafka, and Spark is a plus
- A proactive approach to spotting problems, areas for improvement, and performance bottlenecks

Posted 1 week ago


5.0 - 8.0 years

17 - 20 Lacs

Kolkata

Work from Office

Key Responsibilities:
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability
- Define and enforce data governance, data cataloging, and metadata management best practices
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficien...

Posted 1 week ago


8.0 - 13.0 years

10 - 14 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

Job Responsibilities:
1. ETL Development and Data Processing:
- Design, develop, and implement ETL processes to extract, transform, and load data from various sources into data warehouses or analytics systems.
- Maintain and enhance ETL pipelines to ensure data quality, integrity, and consistency.
2. DevOps and CI/CD Pipeline Management:
- Implement and support CI/CD pipelines for Enterprise Data Warehouse (EDW) code development and deployment using GitLab.
- Manage and configure workflows for code repositories, ensuring seamless code integration and promotion across environments.
3. Data Analysis and Reporting:
- Apply statistical analysis and data mining techniques to uncover trends, p...

Posted 1 week ago


1.0 - 3.0 years

2 - 5 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

Requirements:
- Strong understanding of data integration concepts and ETL/ELT processes
- Solid understanding of SQL and relational databases
- Bachelor's degree in Computer Science, Engineering, or a related field

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration from various sources
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs
- Optimize and improve existing data systems for performance, reliability, and scalability
- Implement data quality and validation processes to ensure data integrity and accuracy
- Manage and monitor data environments, ...

Posted 1 week ago


4.0 - 9.0 years

5 - 9 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

We are seeking an experienced Senior SAP BTP Dynamic Forms Consultant to drive the creation, configuration, and management of dynamic forms using SAP Business Technology Platform (BTP). The ideal candidate will have hands-on expertise with BTP Build Apps, Workzone, and dynamic forms design, along with workflow integration and accessibility configurations.

Key Responsibilities:
Dynamic Forms Design & Implementation:
- Design and implement dynamic forms using low-code/no-code form designers.
- Configure and manage form controls, including static images, text fields, dropdown menus, calculated fields, and attachments.
- Create and edit form definitions to meet complex business requirements.
- ...

Posted 1 week ago


1.0 - 4.0 years

3 - 6 Lacs

Noida

Work from Office

Contract Duration: 3 months

Skills:
- PySpark or Scala with Spark, Spark architecture, Hadoop, SQL
- Streaming technologies such as Kafka
- Proficiency in advanced SQL (window functions)
- Airflow, S3, and StreamSets or similar ETL tools
- Basic knowledge of AWS IAM, AWS EMR, and Snowflake

Responsibilities:
- Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines to collect, process, and store large volumes of structured and unstructured data.
- Data Modeling: Develop and maintain data models, schemas, and metadata to support the organization's data initiatives. Ensure data integrity and optimize data storage and retrieval processes.
- Data Integ...
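As a purely illustrative aside, the "advanced SQL (window functions)" skill this listing asks for can be demonstrated with a small self-contained snippet (SQLite via Python's sqlite3; the table and data are hypothetical):

```python
import sqlite3

# Hypothetical sales table, used only to illustrate a window-function query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100), ("north", 300), ("south", 200), ("south", 50)])

# Rank each sale within its region by amount (highest first) using
# RANK() OVER (PARTITION BY ... ORDER BY ...).
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for region, amount, rnk in rows:
    print(region, amount, rnk)
```

The same PARTITION BY/ORDER BY pattern carries over to Spark SQL and Snowflake, which this listing also mentions.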

Posted 1 week ago


8.0 - 12.0 years

15 - 25 Lacs

Pune

Work from Office

Dear Candidate,

Please find the details below:
Skills: Data Lead (MS Fabric - mandatory skill)
Experience: 8 to 12 years
Location: Pune, Viman Nagar (WFO)

Job description:
- Design and build data pipelines, lakehouse architectures (Bronze-Silver-Gold), and semantic models in Microsoft Fabric.
- Develop data ingestion and transformation workflows using Data Factory, Dataflows Gen2, SQL, and PySpark.
- Integrate Fabric with Power BI for analytics and reporting, ensuring data quality, governance, and performance optimization.
- Implement DevOps practices (CI/CD, Git) for Fabric assets and stay updated with the Microsoft Fabric roadmap (Cortex AI, Copilot, Real-Time Intelligence).
Kindly sh...
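For readers unfamiliar with the Bronze-Silver-Gold (medallion) layering this listing mentions, here is a minimal, hypothetical sketch in plain Python; a real Fabric implementation would use PySpark or Dataflows Gen2, and the record shapes here are invented for illustration:

```python
# Bronze layer: raw records exactly as ingested, including bad and duplicate rows.
bronze = [
    {"order_id": "1", "amount": "10.5", "country": "in"},
    {"order_id": "2", "amount": "bad",  "country": "IN"},   # unparseable amount
    {"order_id": "1", "amount": "10.5", "country": "in"},   # duplicate key
]

def to_silver(rows):
    """Silver layer: cleaned, typed, de-duplicated records."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue                      # drop rows that fail typing
        if r["order_id"] in seen:
            continue                      # drop duplicates
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount,
                    "country": r["country"].upper()})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per country)."""
    agg = {}
    for r in rows:
        agg[r["country"]] = agg.get(r["country"], 0.0) + r["amount"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 10.5}
```

Each layer only reads from the one below it, which is the core design choice of the medallion pattern: raw data stays replayable while downstream consumers see progressively cleaner tables.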

Posted 1 week ago


5.0 - 10.0 years

3 - 6 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

Job Responsibilities:

SQL Development:
- Write medium to complex SQL queries to support data extraction, transformation, and analysis.
- Optimize SQL queries for performance and maintainability.

PySpark/Spark Development:
- Develop, test, and deploy data processing applications using PySpark or Spark with Scala.
- Implement and maintain ETL/ELT data pipelines, ensuring efficient data processing and integration.

ETL/Data Engineering Pipeline Design:
- Design and implement robust ETL pipelines to support data ingestion, transformation, and loading.
- Collaborate with data architects and engineers to ensure seamless data flow and integration across systems.

Cloud Technology Utilization:
- L...

Posted 1 week ago


1.0 - 5.0 years

3 - 7 Lacs

Nagpur

Work from Office

Key Skills: We are looking for a Flutter app developer with a passion for pushing mobile technologies to the limits. This developer will work with our team of talented engineers to design and build the next generation of our mobile applications.

Job Description:
- Experience using web services, REST APIs, and data parsing with XML, JSON, etc.
- Collaborate with cross-functional teams to define, design, and ship new features
- Unit-test code for robustness, including edge cases, usability, and general reliability
- Work on bug fixing and improving application performance
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
- Can work ind...

Posted 1 week ago


13.0 - 18.0 years

25 - 40 Lacs

Pune

Hybrid

Position Title: Lead Data Engineer (Staff Enterprise Technology Engineer)

You will work with the team responsible for driving application simplification across the organization, focusing on reducing operational complexity and technical debt. The team works closely with the wider digital delivery and digital core teams to identify and pursue simplification opportunities, and collaborates with business partners to align simplification efforts with broader transformation goals and drive measurable improvements in operational efficiency.

Let me tell you about the role: A Staff Data Engineer designs and builds scalable data management systems that support application simplification efforts. They develop and maintain...

Posted 1 week ago


8.0 - 13.0 years

27 - 42 Lacs

Gurugram

Work from Office

Skills: GCP Data Engineer
Experience: 6 to 12 years
Location: AIA, Gurgaon

- Hands-on experience with GCP services, specifically BigQuery, Cloud Storage, and Composer, for data pipeline orchestration
- Proficiency in the Databricks platform with PySpark for building and optimizing large-scale ETL/ELT processes
- Expertise in writing and tuning complex SQL queries for data transformation, aggregation, and reporting on large datasets
- Experience integrating data from multiple sources such as APIs, cloud storage, and databases into a central data warehouse
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer for scheduling, monitoring, and managing data jobs
- Knowledge of versio...

Posted 1 week ago


3.0 - 5.0 years

20 - 22 Lacs

Udaipur

Work from Office

- 3-5 years of experience in data engineering or similar roles
- Strong foundation in cloud-native data infrastructure and scalable architecture design
- Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools
- Design and optimize data lakes and data warehouses for real-time and batch processing

Posted 1 week ago


5.0 - 7.0 years

15 - 25 Lacs

Udaipur

Work from Office

- 5 to 7 years of experience in data engineering
- Architect and maintain scalable, secure, and reliable data platforms and pipelines
- Design and implement data lake/data warehouse solutions such as Redshift, BigQuery, Snowflake, or Delta Lake
- Build real-time and batch data pipelines using tools like Apache Airflow, Kafka, Spark, and dbt
- Ensure data governance, lineage, quality, and observability

Posted 1 week ago


8.0 - 13.0 years

3 - 7 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

Job Responsibilities:

MDM Data Vault Design and Development:
- Efficiently design and develop dbt-based data models and entities based on project requirements.
- Implement Data Vault 2.0 architecture principles, including the creation of Hubs, Links, and Satellites for master data management (MDM).

Data Warehousing:
- Design, develop, and maintain robust data structures to facilitate efficient data processing, storage, and retrieval within an enterprise data warehouse (Snowflake).
- Collaborate with data architects to ensure data warehousing solutions align with business needs and scalability requirements.

SQL Development:
- Create, debug, and optimize stored procedures, function...
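As a hedged illustration of the Data Vault 2.0 structures this listing names (Hubs, Links, Satellites), here is a minimal Python sketch; all table and column names are hypothetical, and a production implementation would typically generate these structures via dbt/SQL rather than Python:

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault-style hash key: a hash over normalized business key(s).
    MD5 is the conventional Data Vault 2.0 choice (not for security)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode()).hexdigest()

# Hub: one row per unique business key, identified by its hash key.
hub_customer = {"customer_hk": hash_key("C-001"), "customer_bk": "C-001"}

# Satellite: descriptive attributes hanging off the hub, sharing its hash key
# plus a load date for history.
sat_customer = {"customer_hk": hub_customer["customer_hk"],
                "name": "Acme Ltd", "load_date": "2024-01-01"}

# Link: a relationship between two hubs, keyed by a hash of both business keys.
link_customer_order = {"link_hk": hash_key("C-001", "O-9001"),
                       "customer_hk": hash_key("C-001"),
                       "order_hk": hash_key("O-9001")}
```

The normalization (trim and upper-case) inside `hash_key` is what lets the same business entity arriving from different MDM sources land on the same Hub row.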

Posted 1 week ago


5.0 - 10.0 years

3 - 6 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

We are looking for a skilled Data Engineer with expertise in PySpark, AWS, and SQL to support data processing and analytical initiatives. This role involves working closely with data engineering and data science teams to build, maintain, and optimize large-scale data pipelines and integrations on AWS. The ideal candidate will be proficient in ETL processes using PySpark and SQL, with a deep understanding of cloud data infrastructure, specifically within the AWS ecosystem.

Key Responsibilities:

Data Pipeline Development:
- Design, build, and optimize ETL pipelines using PySpark for data ingestion, transformation, and storage on AWS.
- Collaborate with stakeholders to understand data require...

Posted 1 week ago


1.0 - 4.0 years

5 - 9 Lacs

Mumbai, Bengaluru, Delhi/NCR

Work from Office

Job Responsibilities:

Data Governance Expertise:
- Develop and maintain comprehensive data governance strategies to ensure data quality, security, and compliance.
- Establish and implement policies, standards, and guidelines related to data governance, data stewardship, and data quality management.
- Lead initiatives to enhance data governance capabilities, including data catalog, data dictionary, business lineage, technical lineage, and stewardship workflows.

Data Catalog Management:
- Configure, maintain, and optimize data catalog tools such as Alation, Collibra, Informatica, or similar platforms.
- Ensure accurate metadata management and data lineage tracking across various systems.
- C...

Posted 1 week ago


6.0 - 10.0 years

15 - 27 Lacs

Bangalore Rural, Bengaluru

Hybrid

Required Skills & Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data, and map transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate...

Posted 1 week ago


5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Grade Level (for internal use): 10

The Role: Senior Data Engineer

The Team: The MI Sustainability team is dedicated to driving sustainable business practices through innovative data solutions. We are a diverse group of passionate individuals committed to leveraging technology to promote environmental and social responsibility. Our team values collaboration, creativity, and a global mindset, striving to make a positive impact within our organization and the broader community.

The Impact: This role contributes significantly to the business by developing and maintaining scalable data pipelines that support the increasing complexity of data. The successful candidate will enable cross-functional ...

Posted 1 week ago


6.0 - 10.0 years

25 - 30 Lacs

Hyderabad

Hybrid

Job Title: Senior Data Engineer - AWS | Python | Data Pipelines
Experience: 6-10 years
Location: Hyderabad
Employment Type: Full-time | Permanent
Notice Period: Immediate

About the Role: We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS cloud services and Python-based data engineering. In this role, you will architect and build scalable, automated, high-performance data platforms that power analytics, AI, and business intelligence across the organization.

Key Responsibilities:
- Architect and implement end-to-end data pipelines using AWS Glue, Lambda, EMR, Step Functions, and Redshift.
- Design and manage data lakes and warehouses on Amazon S3, Redshift, and A...

Posted 1 week ago


4.0 - 8.0 years

0 - 1 Lacs

Hyderabad

Remote

About the Role: We are looking for an experienced AI Engineer to design, develop, and deploy AI/ML-based solutions that power our products and internal systems. You will collaborate with data scientists, software engineers, and product teams to build scalable and production-ready AI applications.

Key Responsibilities:
- Design, develop, and deploy machine learning and deep learning models for production environments.
- Work with large-scale datasets, performing data preprocessing, feature engineering, and exploratory data analysis (EDA).
- Collaborate with cross-functional teams to integrate AI models into production-grade systems and services (e.g., via APIs and microservices).
- Optimize model performan...
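As a small illustration of the data preprocessing step this listing mentions, here is a pure-Python sketch of z-score standardization; real pipelines would typically use pandas or scikit-learn, and the input values are hypothetical:

```python
def standardize(values):
    """Return z-scores: (x - mean) / std, using the population standard
    deviation. Assumes at least two distinct values (std > 0)."""
    mean = sum(values) / len(values)
    variance = sum((x - mean) ** 2 for x in values) / len(values)
    std = variance ** 0.5
    return [(x - mean) / std for x in values]

raw = [10.0, 20.0, 30.0]
scaled = standardize(raw)
print(scaled)  # approximately [-1.2247, 0.0, 1.2247]
```

Standardizing features to zero mean and unit variance is a routine step before training many model families (e.g. linear models, neural networks), since it keeps features on comparable scales.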

Posted 1 week ago


1.0 - 3.0 years

1 - 5 Lacs

Bengaluru

Hybrid

The Team: The Automotive Insights - Supply Chain and Technology and IMR department at S&P Global is dedicated to delivering critical intelligence and comprehensive analysis of the automotive industry's supply chain and technology. Our team provides actionable insights and data-driven solutions that empower clients to navigate the complexities of the automotive ecosystem, from manufacturing and logistics to technological innovations and market dynamics. We collaborate closely with industry stakeholders to ensure our research supports strategic decision-making and drives growth within the automotive sector. Join us to be at the forefront of transforming the automotive landscape with cutting-ed...

Posted 1 week ago


3.0 - 5.0 years

8 - 17 Lacs

Pune

Hybrid

Company Introduction: Coditas is a new-age, offshore product development organization offering services across the entire software development life cycle. Headquartered in Pune, Coditas works with clients across the globe. We attribute our organic growth to an engineering-driven culture and steadfast philosophies around writing clean code, designing intuitive user experiences, and letting the work speak for itself.

Job Description: We are looking for data engineers who have the right attitude, aptitude, skills, empathy, compassion, and hunger for learning, to build products in the data analytics space. A passion for shipping high-quality data products, interest in the data products space;...

Posted 1 week ago


5.0 - 10.0 years

0 Lacs

Haryana

On-site

As a member of Spectral Consultants, a US-based management consulting firm, you will be responsible for leading end-to-end project delivery, including design and deployment. Your role will involve architecting solutions using AWS/Azure technologies such as EMR, Glue, and Redshift. Additionally, you will guide and mentor technical teams, interface with senior stakeholders, and work in Agile environments.

Your qualifications should include:
- 5+ years of experience in tech consulting or solution delivery
- Proficiency in Python/Scala and cloud platforms like AWS/Azure
- Hands-on experience in data engineering and distributed systems
- Excellent communication and leadership skills

If you are a ...

Posted 1 week ago


8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: You will be joining Health Catalyst, a fast-growing company dedicated to solving national-level healthcare problems, as a Principal Snowflake Data Engineer & Data Engineering Lead. In this role, you will play a crucial part in leading and mentoring cross-functional teams focused on developing innovative tools that support the mission of improving healthcare performance, cost, and quality. Your responsibilities will include owning the architectural vision and implementation strategy for Snowflake-based data platforms, leading the design, optimization, and maintenance of ELT pipelines, and driving best practices in schema design and data modeling.

Key Responsibilities:
- Own the...

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be working as an Infrastructure Consulting Practitioner, supporting clients in designing, building, and running an agile, scalable IT infrastructure. Your role will involve standardizing and optimizing the infrastructure to enhance operational efficiency, improve employee performance, and meet dynamic business demands. You will be responsible for implementing infrastructure improvements across workplace, network, data center, and operations. Additionally, you will provide resources for managing and running the infrastructure on a managed or capacity services basis.

Roles & Responsibilities:
- Perform independently and become a subject matter expert (SME).
- Actively participate ...

Posted 1 week ago
