MatchMove - Data Engineer - Numpy/Pandas

4 - 6 years

0 Lacs

Posted: 21 hours ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

You Will Get To

  • Design, build, and maintain high-performance data pipelines that integrate large-scale transactional data from our payments platform, ensuring data quality, reliability, and compliance with regulatory requirements.
  • Develop and manage distributed data processing pipelines for both high-volume data streams and batch processing workflows in a cloud-native AWS environment.
  • Implement observability and monitoring tools to ensure the reliability and scalability of the data platform, enabling stakeholders to make confident, data-driven decisions.
  • Collaborate with cross-functional teams to gather requirements and deliver business-critical data solutions, including automation of payment-transaction lifecycle management, regulatory reporting, and compliance.
  • Design and implement data models across various storage paradigms to support payment transactions at scale while ensuring efficient data ingestion, transformation, and storage.
  • Maintain data integrity by implementing robust validation, testing, and error-handling mechanisms within data workflows.
  • Ensure that the data platform adheres to the highest standards for security, privacy, and governance.
  • Provide mentorship and guidance to junior engineers, driving innovation, best practices, and continuous improvement across the team.

Requirements
  • 4-6 years of experience in backend development and/or data platform engineering.
  • Proficiency in Python, with hands-on experience using data-focused libraries such as NumPy, Pandas, SQLAlchemy, and Pandera to build high-quality data pipelines.
  • Strong expertise in AWS services (S3, Redshift, Lambda, Glue, Kinesis, etc.) for cloud-based data infrastructure and processing.
  • Experience with multiple data storage models, including relational, columnar, and time-series databases.
  • Proven ability to design and implement scalable, reliable, and high-performance data workflows, ensuring data integrity, performance, and availability.
  • Experience with workflow orchestrators such as Apache Airflow or Argo Workflows for scheduling and automating data pipelines.
  • Familiarity with Python-based data-stack tools such as dbt, Dask, Ray, and Modin for distributed data processing.
  • Hands-on experience with data ingestion, cataloging, and change-data-capture (CDC) tools.
  • Understanding of DataOps and DevSecOps practices to ensure secure and efficient data pipeline development and deployment.
  • Strong collaboration, communication, and problem-solving skills, with the ability to work effectively across multiple teams and geographies.
  • Experience in payments or fintech platforms is a strong plus, particularly in processing high volumes of transactional data.
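The validation and error-handling responsibilities above typically look something like the following in a Pandas-based batch pipeline. This is a minimal illustrative sketch only; the column names, currencies, and rejection rules are hypothetical and not taken from this posting.

```python
import numpy as np
import pandas as pd

# Hypothetical mini-batch of payment transactions (illustrative only).
txns = pd.DataFrame({
    "txn_id": ["t1", "t2", "t3", "t4"],
    "amount": [120.50, -5.00, 99.99, np.nan],
    "currency": ["SGD", "SGD", "USD", "SGD"],
})

def validate_txns(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split a batch into valid rows and rejected rows tagged with reasons."""
    reasons = pd.Series("", index=df.index)
    reasons[df["amount"].isna()] += "missing_amount;"
    reasons[df["amount"] < 0] += "negative_amount;"
    reasons[~df["currency"].isin({"SGD", "USD"})] += "unknown_currency;"
    bad = reasons != ""
    rejected = df[bad].assign(reject_reason=reasons[bad])
    return df[~bad], rejected

valid, rejected = validate_txns(txns)
```

Routing rejected rows to a quarantine table with machine-readable reasons, rather than failing the whole batch, is one common way to keep pipelines both reliable and auditable.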
(ref:hirist.tech)
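For candidates brushing up, the regulatory-reporting work described in the role often reduces to grouped aggregations over transactional data. A toy Pandas sketch follows; the columns, statuses, and report shape are assumptions for illustration, not part of the job description.

```python
import pandas as pd

# Hypothetical settled/pending transactions (illustrative only).
txns = pd.DataFrame({
    "currency": ["SGD", "SGD", "USD"],
    "amount": [100.0, 50.0, 25.0],
    "status": ["settled", "settled", "pending"],
})

# Per-currency summary of settled volume, a common shape for
# downstream settlement or regulatory reports.
report = (
    txns[txns["status"] == "settled"]
    .groupby("currency", as_index=False)
    .agg(txn_count=("amount", "size"), gross_amount=("amount", "sum"))
)
```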
