Data Engineer

5 - 6 years

7 - 8 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Experience: 5 - 6 years

Job Location: Bangalore


Job Overview:

Are you someone who enjoys designing modern data platforms, experimenting with cloud-native architectures, and turning raw data into scalable, enterprise-ready solutions? Do you thrive in fast-paced environments where innovation, rapid prototyping, and solutioning go hand in hand? If so, People10 is the place for you.


P10 is seeking an experienced Lead Data Engineer with strong cloud expertise to join our R&D and Solutions Lab, a high-impact, innovation-focused team responsible for building data-driven PoCs, reference architectures, and scalable cloud-native data platforms. This role sits at the intersection of data engineering, cloud architecture, and solution strategy, enabling P10 to deliver modern, enterprise-grade data solutions.


In this role, you will design and implement end-to-end data pipelines, orchestrate real-time and batch data flows, and build cloud-agnostic solutions across Azure, AWS, and GCP. You will work on multiple PoCs in parallel, collaborate closely with Sales, Pre-sales, and multiple CoE teams, contribute to RFP and proposal solutioning, and play a key role in defining P10's data and cloud engineering strategy.


This position is ideal for a hands-on, architecture-minded engineer who thrives in a lab environment, operates independently, rapidly validates ideas, and confidently presents data solutions to customers and internal stakeholders.



Roles & Responsibilities

Solution Design & Architecture

Design end-to-end data architectures, including ingestion, transformation, storage, and consumption layers.

Evaluate and propose scalable, cloud-native platform solutions.

Create data models, data flow diagrams, and architecture blueprints for proposals and RFP responses.

Build quick PoCs and prototypes demonstrating P10's Data Engineering & Cloud capabilities.

Cloud Data Engineering

Build and optimize ETL/ELT pipelines using cloud-native services (Azure Data Factory, AWS Glue, GCP Dataflow).

Implement real-time streaming pipelines using Kafka, Event Hubs, Kinesis, or Pub/Sub.

Develop scalable data lakes, lakehouses, and warehouses (Delta Lake, BigQuery, Redshift, Snowflake).

Work with cloud functions, serverless compute, and event-driven architectures.

Data Processing & Modeling

Transform and process large datasets using Spark, Databricks, PySpark, or SQL engines.

Implement batch and streaming workflows for analytics or AI/ML consumption.

Ensure data quality, lineage, auditing, and governance best practices.

Collaboration & Solutioning

Work closely with Sales & Pre-Sales to provide technical feasibility, sizing, and solution inputs.

Participate in requirement gathering meetings and customer workshops as a data engineering expert.

Collaborate with AI/ML, Cloud, and IoT COEs to build integrated solutions.

Support team members by reviewing designs and unblocking technical issues.

Operationalization

Build CI/CD pipelines for data components and automate deployment workflows.

Implement cost optimization, scalability, and performance tuning for cloud data workloads.

Ensure security best practices across identity, data encryption, access control, and network design.

What We're Looking For

Technical Skills

Strong programming skills in Python or Scala, SQL, and data transformation logic.

Hands-on experience with cloud-native data services, including:

Azure: Data Factory, Synapse, Databricks, Event Hub

AWS: Glue, EMR, Redshift, Kinesis, Lambda

GCP: Dataflow, Dataproc, BigQuery, Pub/Sub

Experience with Spark / PySpark, distributed data processing, and pipeline performance tuning.

Proficiency with data modeling, schema design, star/snowflake models.

Strong knowledge of data lake / lakehouse architectures (Delta Lake, Iceberg, Hudi).

Experience with streaming frameworks (Kafka, Event Hub, Kinesis).

Familiarity with containerization (Docker) and CI/CD (GitHub Actions, Azure DevOps, etc.).

Knowledge of security: IAM, VNet, encryption, governance.


Professional Skills

Ability to translate business needs into scalable data solutions.

Strong documentation, diagramming, and solution presentation skills.

Comfortable in pre-sales conversations and technical demonstrations.

Ability to work independently across multiple PoCs under tight timelines.


Good-to-Have Skills

Experience with Snowflake, Databricks advanced features (Unity Catalog).

Exposure to ML feature engineering pipelines and ML model serving.

Familiarity with event-driven microservices and API development.

Understanding of data observability tools (Great Expectations, Monte Carlo).

Experience working in multi-cloud hybrid environments.

People10

IT Services and IT Consulting

New York, NY
