Experience: 3 - 6 years

Salary: 10 - 20 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

About the Role


The ideal candidate should have a strong foundation in Python, FastAPI, and ORMs (like SQLAlchemy), with the ability to handle large datasets, design efficient APIs, and integrate real-time streaming and asynchronous task processing using tools such as Redis, Celery, and message brokers (Kafka/RabbitMQ).

You will collaborate closely with the data engineering and frontend teams to design, build, and optimize APIs and services for real-time data visualization, analytics, and monitoring.

Key Responsibilities

  • Design, develop, and maintain scalable backend services using FastAPI and Python.
  • Handle and process large volumes of structured and unstructured data efficiently.
  • Build real-time data pipelines and streaming APIs for live dashboards and analytics.
  • Integrate message brokers such as Kafka, RabbitMQ, or Redis Streams for event-driven architectures.
  • Develop asynchronous background tasks and schedulers using Celery.
  • Design and optimize database schemas using SQLAlchemy with PostgreSQL or MySQL.
  • Implement caching, data persistence, and pub/sub mechanisms using Redis.
  • Collaborate with frontend teams to expose clean, well-documented APIs for visualization dashboards.
  • Ensure high performance, scalability, and fault tolerance of backend systems.
  • Implement monitoring, logging, and performance profiling in production environments.
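The event-driven, asynchronous processing called for above can be sketched with a stdlib-only asyncio queue standing in for the broker (Kafka/RabbitMQ/Redis Streams) and for a Celery worker; the names, payloads, and sentinel convention here are illustrative, not part of the role's actual stack:

```python
import asyncio

async def producer(queue: asyncio.Queue) -> None:
    # Stand-in for a Kafka/RabbitMQ producer: emit a few events, then a sentinel.
    for i in range(5):
        await queue.put({"event_id": i, "value": i * i})
    await queue.put(None)  # sentinel: no more events

async def consumer(queue: asyncio.Queue, results: list) -> None:
    # Stand-in for a Celery worker consuming from the broker.
    while True:
        msg = await queue.get()
        if msg is None:
            break
        results.append(msg["value"])

async def main() -> list:
    # maxsize bounds the queue, giving the producer backpressure
    # when the consumer falls behind.
    queue: asyncio.Queue = asyncio.Queue(maxsize=2)
    results: list = []
    await asyncio.gather(producer(queue), consumer(queue, results))
    return results

print(asyncio.run(main()))  # -> [0, 1, 4, 9, 16]
```

In a real deployment the queue would be an external broker so producers and workers scale independently; the bounded-queue backpressure idea carries over directly.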

Required Skills

  • Strong proficiency in Python and experience with FastAPI (or similar frameworks like Flask, Django REST).
  • Proficiency with ORMs, especially SQLAlchemy, and a strong understanding of relational databases.
  • Experience with Redis for caching, queuing, or pub/sub systems.
  • Hands-on experience with Celery for distributed or asynchronous task execution.
  • Familiarity with message brokers like Kafka, RabbitMQ, or Redis Streams.
  • Experience building real-time or live data streaming APIs using WebSockets or Server-Sent Events (SSE).
  • Strong grasp of data modeling, semantic layers (e.g., DBT, Dremio), and RESTful API design.
  • Experience handling and visualizing large datasets efficiently.
  • Understanding of containerization (Docker) and modern deployment practices (CI/CD, microservices).
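As a small illustration of the SSE streaming skill listed above, the helper below formats one message in the Server-Sent Events wire format (the `sse_event` name and the metrics payload are hypothetical; a framework such as FastAPI would stream these strings from an endpoint):

```python
from typing import Optional

def sse_event(data: str, event: Optional[str] = None, id: Optional[str] = None) -> str:
    """Format one Server-Sent Events message: optional id/event fields,
    one 'data:' line per line of payload, terminated by a blank line."""
    lines = []
    if id is not None:
        lines.append(f"id: {id}")
    if event is not None:
        lines.append(f"event: {event}")
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"

print(sse_event('{"cpu": 42}', event="metrics", id="1"))
```

Multi-line payloads are split across several `data:` lines, which the browser's `EventSource` rejoins with newlines on receipt.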

Nice to Have

  • Experience with ClickHouse, TimescaleDB, or Elasticsearch for time-series or analytical workloads.
  • Familiarity with frontend visualization tools such as Grafana, Apache Superset, or Plotly Dash.
  • Experience integrating monitoring or metrics systems (e.g., Prometheus, Loki, OpenTelemetry).
  • Exposure to CI/CD pipelines and microservice architectures.
  • Knowledge of semantic modeling frameworks (like Cube.js, AtScale).
  • Basic familiarity with data science workflows or ML model integration in production APIs.
