Data Scientist

Experience: 8 years

Salary: 0 Lacs

Posted: 4 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Location: New Delhi

Experience: 5–8 years

Industry: Financial Services

Job Summary


Data Modeler / Architect

You will play a key role in designing scalable data models, orchestrating reliable data workflows, and ensuring the integrity and performance of mission-critical financial datasets. This is a highly collaborative role interfacing with engineering, analytics, product, and compliance teams.


Key Responsibilities


  • Design, implement, and maintain logical and physical data models to support transactional, analytical, and reporting systems.
  • Develop and manage scalable ETL/ELT pipelines for processing large volumes of financial transaction data.
  • Tune and optimize SQL queries, stored procedures, and data transformations for maximum performance.
  • Build and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
  • Architect data lakes and warehouses using platforms like Databricks, Snowflake, BigQuery, or Redshift.
  • Enforce and uphold data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
  • Collaborate closely with data engineers, analysts, and business stakeholders to understand data needs and deliver solutions.
  • Conduct data profiling, validation, and quality assurance to ensure clean and consistent data.
  • Maintain clear and comprehensive documentation for data models, pipelines, and architecture.
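The dimensional-modeling and SQL-tuning responsibilities above can be sketched with a minimal star schema. All table names, column names, and data below are hypothetical, and in-memory SQLite stands in for a real warehouse engine; this is an illustration of the pattern, not the employer's actual schema:

```python
import sqlite3

# Illustrative star schema: a payment fact table joined to a merchant
# dimension. Names and values are made up for this sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_merchant (
    merchant_id   INTEGER PRIMARY KEY,
    merchant_name TEXT,
    country       TEXT
);
CREATE TABLE fact_payment (
    txn_id      INTEGER PRIMARY KEY,
    merchant_id INTEGER REFERENCES dim_merchant(merchant_id),
    amount      REAL,
    txn_date    TEXT
);
-- Index the foreign key so fact-to-dimension joins avoid full scans.
CREATE INDEX idx_fact_merchant ON fact_payment(merchant_id);
""")
cur.executemany("INSERT INTO dim_merchant VALUES (?, ?, ?)",
                [(1, "Acme Pay", "IN"), (2, "Globex", "US")])
cur.executemany("INSERT INTO fact_payment VALUES (?, ?, ?, ?)",
                [(101, 1, 250.0, "2024-01-05"),
                 (102, 1, 100.0, "2024-01-06"),
                 (103, 2, 75.5, "2024-01-06")])

# Typical analytical query against the star schema: revenue per merchant.
cur.execute("""
    SELECT m.merchant_name, SUM(f.amount)
    FROM fact_payment f JOIN dim_merchant m USING (merchant_id)
    GROUP BY m.merchant_name ORDER BY m.merchant_name
""")
revenue = cur.fetchall()
print(revenue)  # [('Acme Pay', 350.0), ('Globex', 75.5)]
```

The same separation of facts from dimensions is what the role's "logical and physical data models" bullet refers to; indexing the join key is the most basic of the query-tuning techniques listed.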


Required Skills & Qualifications


  • 5–8 years of experience as a Data Modeler, Data Architect, or Senior Data Engineer in the financial/payments domain.
  • Advanced SQL expertise, including query tuning, indexing, and performance optimization.
  • Proficiency in developing ETL/ELT workflows using tools such as Spark, dbt, Talend, or Informatica.
  • Experience with data orchestration frameworks: Airflow, Dagster, Luigi, etc.
  • Strong hands-on experience with cloud-based data platforms like Databricks, Snowflake, or equivalents.
  • Deep understanding of data warehousing principles: star/snowflake schemas, slowly changing dimensions, etc.
  • Familiarity with financial data structures such as payment transactions, reconciliation, fraud patterns, and audit trails.
  • Working knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
  • Strong analytical thinking and problem-solving capabilities in high-scale environments.
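The "slowly changing dimensions" requirement above refers to versioning dimension rows instead of overwriting them. A Type 2 update can be sketched in plain Python; the function and field names are hypothetical, and a real warehouse would do this in SQL or a dbt snapshot rather than in application code:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, as_of):
    """Type 2 slowly-changing-dimension update (illustrative sketch):
    expire the current row for `key` and append a new version,
    preserving history instead of overwriting attributes."""
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:
            if all(row[k] == v for k, v in new_attrs.items()):
                return dim_rows          # no change: keep current version
            row["end_date"] = as_of      # close out the old version
    dim_rows.append({"key": key, **new_attrs,
                     "start_date": as_of, "end_date": None})
    return dim_rows

# A merchant moves country; the old row is kept with an end date.
dim = [{"key": "M1", "country": "IN",
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, "M1", {"country": "US"}, date(2024, 6, 1))
print(len(dim))  # 2 -- old version closed, new version current
```

Keeping both versions is what makes point-in-time reporting and audit trails (also listed above) possible.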


Preferred Qualifications


  • Experience with real-time data pipelines (e.g., Kafka, Spark Streaming).
  • Exposure to data mesh or data fabric architecture paradigms.
  • Certifications in Snowflake, Databricks, or relevant cloud platforms.
  • Knowledge of Python or Scala for data engineering tasks.
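The real-time pipeline work mentioned above typically centers on windowed aggregation of an event stream. The core idea can be sketched in pure Python as a tumbling-window sum; this is a conceptual stand-in for what Kafka Streams or Spark Structured Streaming would do at scale, with made-up data:

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds):
    """Group (timestamp, amount) events into fixed, non-overlapping
    (tumbling) windows and sum amounts per window -- a toy version of
    the window() aggregations streaming engines provide."""
    sums = defaultdict(float)
    for ts, amount in events:
        window_start = (ts // window_seconds) * window_seconds
        sums[window_start] += amount
    return dict(sums)

# Three transactions in the first minute, one in the second.
events = [(0, 10.0), (5, 5.0), (12, 2.5), (61, 1.0)]
print(tumbling_window_sums(events, 60))  # {0: 17.5, 60: 1.0}
```

Production systems add what this sketch omits: out-of-order events, watermarks, and exactly-once state management.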


SabPaisa

Financial Services

New Delhi
