SabPaisa - Data Modeler/Architect

5-8 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

Job Summary

We are looking for a skilled Data Modeler / Architect with 5-8 years of experience in designing, implementing, and optimizing robust data architectures in the financial payments industry. The ideal candidate will have deep expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms such as Databricks or Snowflake. You will play a key role in designing scalable data models, orchestrating reliable data workflows, and ensuring the integrity and performance of mission-critical financial datasets. This is a highly collaborative role interfacing with engineering, analytics, product, and compliance teams.

Key Responsibilities

  • Design, implement, and maintain logical and physical data models to support transactional, analytical, and reporting systems.
  • Develop and manage scalable ETL/ELT pipelines for processing large volumes of financial transaction data.
  • Tune and optimize SQL queries, stored procedures, and data transformations for maximum performance.
  • Build and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
  • Architect data lakes and warehouses using platforms like Databricks, Snowflake, BigQuery, or Redshift.
  • Enforce and uphold data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
  • Collaborate closely with data engineers, analysts, and business stakeholders to understand data needs and deliver solutions.
  • Conduct data profiling, validation, and quality assurance to ensure clean and consistent data.
  • Maintain clear and comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications

  • 5-8 years of experience as a Data Modeler, Data Architect, or Senior Data Engineer in the financial/payments domain.
  • Advanced SQL expertise, including query tuning, indexing, and performance optimization.
  • Proficiency in developing ETL/ELT workflows using tools such as Spark, dbt, Talend, or Informatica.
  • Experience with data orchestration frameworks: Airflow, Dagster, Luigi, etc.
  • Strong hands-on experience with cloud-based data platforms like Databricks, Snowflake, or equivalents.
  • Deep understanding of data warehousing principles: star/snowflake schema, slowly changing dimensions, etc.
  • Familiarity with financial data structures, such as payment transactions, reconciliation, fraud patterns, and audit trails.
  • Working knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
  • Strong analytical thinking and problem-solving capabilities in high-scale environments.

Preferred Qualifications

  • Experience with real-time data pipelines (e.g., Kafka, Spark Streaming).
  • Exposure to data mesh or data fabric architecture paradigms.
  • Certifications in Snowflake, Databricks, or relevant cloud platforms.
  • Knowledge of Python or Scala for data engineering tasks.
(ref:hirist.tech)

SabPaisa

Financial Services

New Delhi
