Experience: 4 - 7 years

Salary: 7 - 12 Lacs

Posted: 1 month ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Location: Hyderabad

Years of Experience: 4 - 7 years

Employment Type: Full Time

Job Summary: Data Engineer

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL workflows using Scala, Python, and PySpark (an illustrative sketch follows this list).
  • Integrate and transform large-scale mutual fund and investment data from multiple sources into a cloud data warehouse (AWS / Snowflake).
  • Collaborate with business stakeholders, data analysts, and product teams to understand data requirements related to mutual fund operations.
  • Build and optimize data systems for performance, reliability, and regulatory compliance.
  • Work on cloud infrastructure (preferably AWS / Snowflake) to deploy and manage data engineering solutions.
  • Ensure data quality, integrity, and consistency across systems.
  • Contribute to the modernization of legacy systems and migration to cloud-based platforms.
  • Maintain documentation and support data governance practices.
  • Participate in Agile ceremonies, provide regular updates, and collaborate across cross-functional teams.
  • Mentor junior engineers and share best practices within the team.
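The first two responsibilities describe a fairly standard PySpark-to-Snowflake load. The sketch below is only an illustration of that pattern: the S3 path, column names, and Snowflake connection settings are assumed placeholders, not details taken from this posting.

    # Illustrative only: every path, column, and connection value here is a
    # placeholder assumption, not something specified in this job posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("mf_transactions_etl").getOrCreate()

    # Read raw mutual fund transaction extracts landed in object storage.
    raw = spark.read.parquet("s3://example-bucket/raw/mf_transactions/")

    # Basic cleansing and conformance: derived trade date, typed amounts, de-duplication.
    clean = (
        raw.withColumn("trade_date", F.to_date("trade_timestamp"))
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .dropDuplicates(["transaction_id"])
           .filter(F.col("amount").isNotNull())
    )

    # Load into Snowflake via the Spark-Snowflake connector (placeholder credentials).
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "etl_user",
        "sfPassword": "***",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "MUTUAL_FUNDS",
        "sfWarehouse": "ETL_WH",
    }

    (clean.write
          .format("net.snowflake.spark.snowflake")
          .options(**sf_options)
          .option("dbtable", "FACT_MF_TRANSACTIONS")
          .mode("append")
          .save())

In practice a job like this would be submitted with the Snowflake Spark connector and JDBC driver on the classpath, and credentials would come from a secrets manager rather than being hard-coded.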

Required Skills & Experience:

  • 4 - 5 years of experience in data engineering roles, preferably in the mutual fund or broader asset management industry.
  • Strong programming skills in SQL, Scala, Python, and PySpark.
  • Proven experience with cloud platforms, especially AWS (e.g., S3, Glue, Lambda, EC2, Redshift).
  • Hands-on experience with Snowflake, including schema design, performance tuning, and data sharing.
  • Strong understanding of mutual fund domain concepts such as NAV calculations, portfolio data, transactions, compliance, and regulatory reporting.
  • Experience building and optimizing data pipelines for large-scale data processing.
  • Familiarity with orchestration tools such as Airflow, AWS Step Functions, or similar (an illustrative scheduling sketch follows this list).
  • Knowledge of data governance, metadata management, and data quality frameworks.
  • Experience with CI/CD pipelines and DevOps best practices is a plus.
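For the orchestration point above, daily loads of this kind are commonly scheduled as an Airflow DAG. The following is a minimal sketch under assumed names: the DAG id, tasks, and cron schedule are illustrative, not requirements stated in the posting.

    # Illustrative Airflow 2.x sketch; DAG id, schedule, and task bodies are assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_nav_feed():
        # Placeholder: fetch the daily NAV / transaction extract from an upstream source.
        print("extracting NAV feed")


    def load_to_snowflake():
        # Placeholder: trigger the PySpark job or Snowflake COPY for the cleaned data.
        print("loading to Snowflake")


    with DAG(
        dag_id="daily_mf_nav_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",  # once a day (Airflow 2.4+ 'schedule'); real cadence depends on feed timings
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_nav_feed", python_callable=extract_nav_feed)
        load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

        extract >> load

The extract >> load dependency is the essential piece: the warehouse load runs only after the upstream feed has been pulled successfully, with retries and alerting layered on in a real deployment.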

Preferred Qualifications:

  • Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
  • Knowledge of SQL and data modelling.
  • Prior experience integrating third-party mutual fund data sources (e.g., Morningstar, Bloomberg, CAMS) is a plus.

Soft Skills:

  • Excellent verbal and written communication skills.
  • Ability to understand business requirements and translate them into technical solutions.
  • Proactive, self-motivated, and able to work independently or in a team environment.
  • Strong analytical and problem-solving skills.
  • Ability to prioritize tasks in a fast-paced environment.

Kfin Technologies

Financial Services

Hyderabad
