Modern Data Engineer - Python, PySpark, AWS, Data Lake

Experience: 5 - 10 years

Salary: 25 - 40 Lacs

Posted: 5 days ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

We are currently considering only candidates based in Bangalore who can attend in-person interviews after CV shortlisting.

We're open to considering candidates with 5 to 12 years of experience, provided they are hands-on coders with strong technical skills.

The Modern Data Engineer is responsible for designing, implementing, and maintaining scalable data architectures using cloud technologies, primarily on AWS, to support the next evolutionary stage of the Investment Process. They build robust data pipelines, optimize data storage and access patterns, and ensure data quality while collaborating across engineering teams to deliver high-value data products.


Key Responsibilities
• Implement and maintain data pipelines for ingestion, transformation, and delivery (a minimal sketch follows this list)
• Ensure data quality through validation and monitoring processes
• Collaborate with senior engineers to design scalable data solutions
• Work with business analysts to understand and implement data requirements
• Optimize data models and queries for performance and efficiency
• Follow engineering best practices and contribute to team standards
• Participate in code reviews and knowledge sharing activities
• Implement data security controls and access policies
• Troubleshoot and resolve data pipeline issues
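To make the ingest-validate-deliver responsibilities concrete, here is a rough, illustrative PySpark sketch only; the S3 paths, column names, and quality rule below are hypothetical examples, not details of this role's actual stack.

```python
# Minimal PySpark sketch: ingest raw CSV, transform, gate on quality, deliver parquet.
# All paths, column names, and rules are hypothetical illustrations.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-pipeline").getOrCreate()

# Ingestion: read raw files landed in the lake (hypothetical bucket/prefix)
raw = spark.read.option("header", True).csv("s3://example-lake/raw/trades/")

# Transformation: normalise types and derive a partition column
trades = (
    raw.withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
       .withColumn("quantity", F.col("quantity").cast("double"))
)

# Validation: fail fast if a required key is missing (a simple quality gate)
if trades.filter(F.col("trade_id").isNull()).limit(1).count() > 0:
    raise ValueError("data quality check failed: null trade_id found")

# Delivery: write curated parquet, partitioned for efficient access patterns
trades.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-lake/curated/trades/"
)
```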


About you
Core Technical Skills
• Cloud Platforms: Expert in leveraging cloud-based data platforms (Snowflake, data lakehouse architecture) to create enterprise lakehouses
• AWS Ecosystem: Advanced expertise with AWS services including Lambda, EMR, MSK, Glue, and S3
• Streaming Architecture: Experience designing event-based or streaming data architectures using Kafka
• Programming: Advanced expertise in Python and SQL (Java/Scala experience welcomed)
• DevOps: Expert in CI/CD pipelines to deploy infrastructure (Terraform) with test automation
• Data Security: Experience implementing data access controls to meet regulatory requirements
• Database Systems: Experience with both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis)
• Data Integration: Experience implementing CDC ingestion techniques
• Orchestration: Experience using workflow tools (Airflow, Control-M, etc.; a minimal DAG sketch follows this list)
• Engineering Practices: Significant experience with GitHub, code verification, validation, and AI-assisted development
• Domain Knowledge: Knowledge of investment management industry concepts, particularly security reference data, fund reference data, transactions, orders, holdings, and fund accounting
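Since Airflow is named among the orchestration tools, a minimal sketch of wiring pipeline steps into a daily DAG could look like the following; the DAG id, schedule, and task bodies are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal Airflow 2.x sketch: a daily DAG chaining ingest -> validate.
# DAG id, schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    """Placeholder for the ingestion step (e.g. submitting the Spark job)."""

def validate():
    """Placeholder for the data-quality gate."""

with DAG(
    dag_id="trades_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)

    ingest_task >> validate_task  # validate runs only after ingest succeeds
```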

Bonus Technical Skills:
• Strong experience in containerization and deploying applications to Kubernetes
• Strong experience in API development using Python-based frameworks like FastAPI (see the sketch after this list)
• Familiarity with Asset Management data domains (Security Reference, Trades, Orders, Holdings, Funds, Accounting, Index, etc.)
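For the FastAPI bullet, here is a minimal sketch of a read-only reference-data endpoint; the model fields and in-memory lookup are hypothetical stand-ins for a real security-reference store, not anything specified by this posting. It can be served locally with `uvicorn main:app`.

```python
# Minimal FastAPI sketch: a read-only security-reference endpoint.
# The model fields and in-memory store are hypothetical stand-ins.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Security(BaseModel):
    isin: str
    name: str

# Hypothetical in-memory store standing in for a reference-data service
SECURITIES = {"US0378331005": Security(isin="US0378331005", name="Apple Inc.")}

@app.get("/securities/{isin}", response_model=Security)
def get_security(isin: str) -> Security:
    security = SECURITIES.get(isin)
    if security is None:
        raise HTTPException(status_code=404, detail="security not found")
    return security
```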

Key Soft Skills:
• Problem-Solving: Leadership experience in problem-solving and technical decision-making
• Communication: Strong in strategic communication and stakeholder engagement
• Project Management: Experienced in overseeing project lifecycles and managing resources alongside Project Managers

Augusta Infotech

Information Technology

New Delhi
